pySpark - Error executing sqlContext

Using Python version 2.7.5 (default, Sep 15 2016 22:37:39)
SparkContext available as sc, HiveContext available as sqlContext.

from pyspark.sql import SQLContext
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
depts = sqlContext.sql("select * from departments")

The console output shows the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o38.sql.
: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table departments. Permission denied: user=ganesh1146, access=EXECUTE, inode="/user/vishaljoneja/sqoop_import/departments":vishaljoneja:hdfs:drwx------
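For context, the stack trace itself explains the failure: the table's HDFS directory is owned by vishaljoneja with mode drwx------, so no other user can traverse it. A minimal sketch (plain Python, no Spark needed) decoding that mode string from the error:

```python
# Illustration only (not part of the fix): why HDFS denies EXECUTE here.
# The error shows the directory mode drwx------ owned by vishaljoneja:hdfs.
# ganesh1146 is neither the owner nor (presumably) in the hdfs group,
# so the "other" permission bits apply.
mode = "drwx------"  # mode string copied from the error message
owner_bits, group_bits, other_bits = mode[1:4], mode[4:7], mode[7:10]

# Traversing (listing into) a directory requires the execute (x) bit.
can_traverse_as_other = "x" in other_bits
print(can_traverse_as_other)  # False: EXECUTE is denied, hence the exception
```

So the query fails not because the SQL is wrong, but because the default-database table points at another user's private directory.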

@ganesh1146 - The table in the default database points to HDFS files under another user's home directory, which your user cannot access. Create the tables in your own database instead of the default database.

Once the table is created there, query it with the database-qualified name:

depts = sqlContext.sql("select * from ganesh1146.departments")
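For reference, the table creation could look like the sketch below. The database name follows the advice above; the column names and the HDFS location are assumptions (typical of the retail_db dataset used in these Sqoop exercises) and should be adjusted to wherever your own Sqoop import actually landed:

```sql
-- Sketch, assuming you have Sqoop-imported your own copy of the data
CREATE DATABASE IF NOT EXISTS ganesh1146;

CREATE EXTERNAL TABLE ganesh1146.departments (
  department_id INT,
  department_name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/ganesh1146/sqoop_import/departments';
```

The key point is that LOCATION must be a directory your user owns or can read, unlike /user/vishaljoneja/... in the error.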