Cannot access Spark UI / YARN UI from itversity labs

Hi Guys,

When I start spark-shell with YARN on gw03, it doesn't show the Spark UI tracking URL, and I'm not able to find the YARN web UI URL either.

[mittal_m2@gw03 ~]$ spark-shell --master yarn --conf spark.ui.port=0 --num-executors 2
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/

Using Scala version 2.10.5 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
– changed the port to 12159 (5 digits); it still doesn't show the URL…

[mittal_m2@gw03 ~]$ spark-shell --master yarn --conf spark.ui.port=12159 --num-executors 2
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/

Using Scala version 2.10.5 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
– how do I find the Spark and YARN web UI URLs?
Thanks, much appreciated.

The same happens with the itversity console: it doesn't show any Spark tracking URL. Could it be an issue with the log4j properties?

[mittal_m2@gw03 ~]$ spark-shell --master yarn --conf spark.ui.port=0 --num-executors 2
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/
Using Scala version 2.10.5 (Java HotSpot™ 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
Spark context available as sc.
SQL context available as sqlContext.
scala>

Could someone help here, please?

@Manish_Mittal

Our admin has disabled verbose logging, which is why the tracking URL is not printed.
You can go to the Resource Manager and track your application from there.
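If you prefer the command line, the YARN CLI can also surface the tracking URL even when driver logging is suppressed. A minimal sketch (the application id below is a made-up example, not one from this cluster):

```shell
# List your running YARN applications; the last column of the output
# is the tracking URL, proxied through the Resource Manager.
yarn application -list -appStates RUNNING

# Or print the full report for one application (the id here is an example)
# and pull out just the Tracking-URL line:
yarn application -status application_1500000000000_0001 2>/dev/null \
  | grep 'Tracking-URL'
```

The `-list` form is handy when you don't know your application id yet; the `-status` form gives a full report once you do.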

Here is the URL for the resource manager - http://rm01.itversity.com:19088/cluster
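On the log4j question: if console logging has been turned down below INFO, the driver never prints the "tracking URL: …" line that YARN's client logs at INFO. A hedged sketch of how you could restore it with your own log4j file, assuming the lab lets you pass driver JVM options (the file path is an example):

```shell
# Write a minimal log4j config that logs at INFO to the console.
cat > ~/log4j-info.properties <<'EOF'
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
EOF

# Point the driver JVM at it; with INFO logging, the tracking URL
# should appear during spark-shell startup.
spark-shell --master yarn --num-executors 2 \
  --driver-java-options "-Dlog4j.configuration=file://$HOME/log4j-info.properties"
```

If the admins have locked down driver options, this won't help, and the Resource Manager UI above remains the way to go.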

Thanks a lot for the resolution!