Unable to locate tracking URL on initiating spark shell

Hi,
On initiating the Spark shell with the following command, it does not display the tracking URL.
In fact, the log output on the prompt is much shorter than the logs shown in the videos.

Can you please help?

spark-shell --master yarn \
  --conf spark.ui.port=12687 \
  --num-executors 1 \
  --executor-memory 512M

[jshivakumaar@gw02 ~]$ spark-shell --master yarn \
  --conf spark.ui.port=12687 \
  --num-executors 1 \
  --executor-memory 512M
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/
Using Scala version 2.10.5 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.

scala> val orders_rdd = sc.textFile("/public/retail_db/orders/")
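Once the shell is up, a quick sanity check can confirm the RDD is readable even without the tracking URL. This is a minimal sketch, assuming the same /public/retail_db/orders/ path exists on the cluster's HDFS and that `sc` is the SparkContext the shell creates:

```scala
// Sanity-check the RDD created above inside spark-shell
// (assumes /public/retail_db/orders/ exists on HDFS)
val orders_rdd = sc.textFile("/public/retail_db/orders/")

// Trigger an action so the job actually runs on YARN
println(orders_rdd.count())            // total number of order records

// Preview a few raw lines to confirm the data is readable
orders_rdd.take(5).foreach(println)
```

Running an action like `count()` also makes the application show up in the Resource Manager UI, which helps when the tracking URL is not printed.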

Regards,
Shiva



Our admin has disabled extensive logging, which is why the tracking URL is not generated.
You can go to the Resource Manager and track the application from there.

Here is the URL for resource manager - http://rm01.itversity.com:19088/cluster
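Besides the Resource Manager UI, the application and its tracking URL can also be listed from a gateway node with the YARN CLI. A minimal sketch, assuming the `yarn` command is on the PATH:

```shell
# List running YARN applications; the output includes each
# application's ID, name, state, and Tracking-URL column
yarn application -list -appStates RUNNING
```

The Tracking-URL column in the output points at the same application page the Resource Manager UI links to.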

Thank you for the quick response. This helps.

Regards,
Shiva