Spark 2 support for SQLContext

In the lab, launching spark2-shell creates only a SparkContext (`sc`).
A `sqlContext` value is not defined.

**spark2-shell --master yarn --conf spark.ui.port=12654**

SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at
Spark context available as ‘sc’ (master = yarn, app id = application_1589064448439_61930).
Spark session available as ‘spark’.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_222)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark
res0: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@16c8e9b8

scala> sqlContext
<console>:27: error: not found: value sqlContext
       sqlContext
       ^


Hi @Saurabh_Banerjee,

In Spark 2.0.x, the entry point to Spark is SparkSession, which is available in the Spark shell as `spark`, so try it this way:


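A minimal sketch of what this looks like inside spark2-shell, where `spark` is predefined. For legacy code that still expects a `sqlContext`, one can be derived from the SparkSession via its `sqlContext` field (the sample query here is only for illustration):

```scala
// Inside spark2-shell, the SparkSession is already bound to `spark`,
// so SQL can be run directly on it:
spark.sql("SELECT 1 AS id").show()

// Legacy Spark 1.6-style code that references `sqlContext` can be
// satisfied by pulling the SQLContext out of the session:
val sqlContext = spark.sqlContext
sqlContext.sql("SELECT 1 AS id").show()
```

Both calls go through the same underlying session, so there is no need to construct a separate SQLContext as in Spark 1.x.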
Thank you … appreciate your help.