Spark UI: Finding my job


#1

How do I find my jobs in the Spark UI?

For example, I pick up the applicationId from the pyspark shell and try to search for my job in the Spark UI, but I don't find it.

spark.sql("select partkey,sum(retailprice) from part_table group by partkey").show(10)
+-------+------------------+
|partkey|  sum(retailprice)|
+-------+------------------+
|1107801|           1808.75|
|1107984|1991.9300537109375|
|1108418|1426.3599853515625|
|1108598|   1606.5400390625|
|1109113|  1122.06005859375|
|1109138|1147.0799560546875|
|1109159|1168.0999755859375|
|1110010|1019.9600219726562|
|1110074|  1084.02001953125|
|1110526| 1536.469970703125|
+-------+------------------+
only showing top 10 rows

sc.applicationId
u'application_1533353649473_1775'

I don't see it either among the incomplete (running) applications at
http://gw03.itversity.com:18080/?page=3&showIncomplete=true
or among the completed (history) jobs at
http://gw03.itversity.com:18080/?page=3&showIncomplete=false
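
For reference, here is a minimal sketch to dump the application ids the history server has actually recorded, via its REST API (this assumes the Python requests library is available; the /api/v1/applications endpoint is part of Spark's standard monitoring API):

import requests

# List every application id the history server knows about,
# so we can check whether ours was recorded at all.
apps = requests.get("http://gw03.itversity.com:18080/api/v1/applications").json()
print([a["id"] for a in apps])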

Can you guys please assist?

Regards,
Sameer


#2

The job usually shows up a little later, but here nothing is populated in the Spark UI at all, neither the executor nor the DAG information. This is kind of confusing.


#3

@Sameer_Rao After launching the spark-shell (Spark 2), you will find a link with a port number printed in the startup logs. Paste it into the browser and it will open the UI of your currently running application. A running application is served by the driver's own UI (port 4040 by default), not by the history server, which lists applications from their event logs (usually once they finish).
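
If you don't want to scroll through the logs, here is a minimal sketch (assuming Spark 2.1+ and a live pyspark shell, like in #1) that asks the running SparkContext for the same URL:

# Print the driver UI URL and the application id directly,
# instead of hunting for them in the shell startup logs.
print(sc.uiWebUrl)       # e.g. http://<driver-host>:4040
print(sc.applicationId)  # the id to look for in that UI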