I launched spark-shell and saw the pre-built sqlContext variable. When I type sqlContext and hit Enter, the response shows it is of type HiveContext. Now I can use sqlContext.sql to pull data from both DataFrames and Hive tables.
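Roughly, this is the session I have (assuming a Spark 1.x shell built with Hive support; df and the table names below are just placeholders, not my real ones):

```scala
// spark-shell pre-creates sqlContext; typing it shows the runtime type
scala> sqlContext
// res0: org.apache.spark.sql.SQLContext = org.apache.spark.sql.hive.HiveContext@...

// it can query a DataFrame registered as a temp table...
scala> df.registerTempTable("my_temp_table")
scala> sqlContext.sql("SELECT * FROM my_temp_table").show()

// ...and also an existing Hive table, with the same variable
scala> sqlContext.sql("SELECT * FROM my_hive_table").show()
```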
Am I missing something? I remember one of the DataFrame videos mentioned creating a new HiveContext variable if I wanted to pull data from Hive tables, but here I am able to use the same sqlContext variable to pull data from both DataFrames and Hive tables. Please advise.