SQLContext from pyspark

I am running pyspark commands in the labs and I am getting the error below while using SQLContext. Kindly help.
dept = sqlContext.sql("select * from departments")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/context.py", line 580, in sql
    return DataFrame(self._ssql_ctx.sql(sqlQuery), self)
  File "/usr/hdp/2.5.0.0-1245/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/sql/utils.py", line 51, in deco
    raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: u'Table not found: departments;'

Please see the last line of the error: it says the table was not found. Check whether the table actually exists.
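
A quick way to check from the pyspark shell (assuming the same sqlContext object as in your session) is to list the tables the context can actually see:

# List the tables visible to this context; if "departments" is not
# in the output, this context is not reading the Hive metastore.
sqlContext.sql("show tables").show()
print(sqlContext.tableNames())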

I checked the table in Hive and it exists. That is why I raised this request.

Try this: a plain SQLContext in Spark 1.x cannot see tables registered in the Hive metastore, but a HiveContext can. Create one explicitly:

from pyspark.sql import HiveContext

# HiveContext reads table definitions from the Hive metastore
sqlContext = HiveContext(sc)

dept = sqlContext.sql("select * from retail_db.depart_new limit 10")
dept.show()
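
If it still fails, it may help to confirm that the HiveContext actually sees your database and table. The names retail_db and depart_new below are just the ones from the snippet above; substitute your own:

# List the databases, and the tables inside retail_db, as the HiveContext sees them
sqlContext.sql("show databases").show()
sqlContext.sql("show tables in retail_db").show()

# Equivalent lookup without writing SQL text
dept = sqlContext.table("retail_db.depart_new")
dept.show(10)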