Tracking URL for Spark missing

spark-shell --master yarn --conf spark.ui.port=12654
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.

scala>
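The "Multiple versions of Spark are installed" warning in the transcript above means the shell defaulted to Spark 1. On HDP-style installs the wrapper scripts honor the SPARK_MAJOR_VERSION environment variable, so a sketch of launching against Spark 2 instead (assuming your distribution's spark-shell wrapper reads this variable) would be:

```shell
# Pick Spark 2 explicitly before launching the shell
# (HDP convention; assumes the spark-shell wrapper on this
# cluster checks SPARK_MAJOR_VERSION)
export SPARK_MAJOR_VERSION=2
spark-shell --master yarn --conf spark.ui.port=12654
```

With the variable set, the "Spark1 will be picked by default" warning should no longer appear.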


Stopping the Spark context also does not seem to work. Here is an example:
scala> sc.stop

scala> sc
res1: org.apache.spark.SparkContext = org.apache.spark.SparkContext@4229b92c

scala>
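Note that the transcript above does not necessarily show a failure: sc.stop stops the context, but the REPL variable sc still points at the (now stopped) object, so printing it still shows a reference. In spark-shell you can confirm this with sc.isStopped (available in Spark 1.6). A minimal sketch in plain Scala (no Spark needed; FakeContext is a stand-in, not a Spark class) illustrating why the reference survives stop():

```scala
// stop() changes the object's internal state; it does not
// unbind the variable that references the object.
class FakeContext {
  private var stopped = false
  def stop(): Unit = { stopped = true }
  def isStopped: Boolean = stopped
}

val sc = new FakeContext
sc.stop()
println(sc)           // still prints a reference, e.g. FakeContext@4229b92c
println(sc.isStopped) // prints: true
```

The same reasoning applies in spark-shell: after sc.stop, any attempt to actually use sc (e.g. sc.parallelize(...)) should fail, even though the variable still prints a reference.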

@Ravinder_Singh
Our admin has disabled extensive logging, which is why the tracking URL is not generated.
You can go to the Resource Manager and track the job from there.

Here is the URL for resource manager - http://rm01.itversity.com:19088/cluster

Thanks for the information.