SparkContext vs HiveContext


I launched spark-shell and saw that a sqlContext is available. When I type sqlContext and hit Enter, the REPL reports that it is of type HiveContext. Now I can use sqlContext.sql to pull data from both DataFrames and Hive tables.

Am I missing something? I remember that one of the DataFrame videos mentioned creating a new hiveContext variable if I wanted to pull data from Hive tables, but here I am able to use the same sqlContext variable to pull data from both a DataFrame and a Hive table. Please suggest.

@ubuntuaws No, you are not missing anything. The Spark REPL automatically creates a sqlContext that is actually an instance of hive.HiveContext, so you do not need to create one yourself. If you do want another HiveContext, you can create it like this:
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
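To illustrate, here is a minimal sketch that could be pasted into spark-shell (Spark 1.x). The file name `people.json` and the Hive table name `src` are hypothetical placeholders; the point is that the single auto-created `sqlContext` handles both registered DataFrames and Hive tables:

```scala
// Confirm what spark-shell actually gave us: the auto-created
// sqlContext is an instance of HiveContext, not plain SQLContext.
sqlContext.isInstanceOf[org.apache.spark.sql.hive.HiveContext]

// Query a DataFrame: build one, register it as a temp table,
// then read it back through sqlContext.sql.
val df = sqlContext.read.json("people.json")   // hypothetical file
df.registerTempTable("people")
sqlContext.sql("SELECT name FROM people").show()

// Query a Hive table with the very same context -- no separate
// hiveContext variable is required ("src" is a hypothetical table).
sqlContext.sql("SELECT * FROM src LIMIT 10").show()
```

Because HiveContext extends SQLContext, everything a plain SQLContext can do (DataFrames, temp tables) also works through it, plus HiveQL and access to the Hive metastore.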