spark-shell --master yarn --conf spark.ui.port=12654
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
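The launcher warning above says multiple Spark versions are installed and Spark1 is picked by default. On HDP-style installs, the `SPARK_MAJOR_VERSION` environment variable controls which version the `spark-shell` wrapper launches; a minimal sketch of selecting Spark 2 explicitly (the port value is just carried over from the command above):

```shell
# Pick Spark 2 instead of the Spark1 default, then launch the shell as before
export SPARK_MAJOR_VERSION=2
spark-shell --master yarn --conf spark.ui.port=12654
```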
Stopping the Spark context is also not working. Here is the example:
scala> sc
res1: org.apache.spark.SparkContext = org.apache.spark.SparkContext@4229b92c
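For reference, a minimal sketch of what stopping and replacing the context looks like from the spark-shell prompt. This is not the author's session; the app name and master below are hypothetical placeholders, and it assumes the predefined `sc` of a Spark 1.6 shell:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// In spark-shell, `sc` is predefined. Stop it cleanly:
sc.stop()

// A stopped context cannot be reused; create a fresh one if needed.
// (appName and master are placeholders for illustration.)
val conf = new SparkConf().setAppName("restarted-shell").setMaster("yarn-client")
val sc2  = new SparkContext(conf)
```

Note that after `sc.stop()` the original `sc` binding still exists in the shell but any action on it will fail, which can look like "stop is not working" even when the context did shut down.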