Spark utilities not starting fully

When I run the command below (spark-shell --master yarn…), several other Spark utilities do not start. So when I join orders and orderItems using reduceByKey, there is no Stage 2 in the DAG Visualization under Spark Jobs. Please let me know how to solve this issue.
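For reference, this is roughly the code I am running, as a minimal sketch; the file paths and field positions are my assumptions, not the exact job:

// Key both datasets by order_id, then join; the join forces a shuffle,
// which should appear as a separate stage in the DAG Visualization.
val orders = sc.textFile("/public/retail_db/orders").
  map(line => (line.split(",")(0).toInt, line))          // (order_id, order record)
val orderItems = sc.textFile("/public/retail_db/order_items").
  map(line => (line.split(",")(1).toInt, line))          // (order_id, item record)
val joined = orders.join(orderItems)
joined.take(5).foreach(println)

// reduceByKey triggers a shuffle in the same way, e.g. subtotal per order
// (assuming the subtotal is the fifth comma-separated field):
val revenuePerOrder = orderItems.
  map { case (orderId, item) => (orderId, item.split(",")(4).toFloat) }.
  reduceByKey(_ + _)
revenuePerOrder.take(5).foreach(println)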

[hit2jai2017@gw01 ~]$ spark-shell --master yarn-client --conf spark.ui.port=14532
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/

Using Scala version 2.10.5 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
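The SPARK_MAJOR_VERSION warning above might be related. I assume the Spark version can be pinned before launching the shell, something like this (untested on my end):

export SPARK_MAJOR_VERSION=1
spark-shell --master yarn-client --conf spark.ui.port=14532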