Spark servers seem to be down for 16 hours now... Any ETA?

apache-spark

#1

Spark servers seem to be down! Any ETA?


#2

They are running now. Even if it goes down again, Spark services won’t be affected; you can ignore it and continue using the Labs.


#3

I don’t get a prompt… the message (STATE=ACCEPTED) keeps repeating…
I run the following command:
pyspark --master yarn --conf spark.ui.port=12888

Is there anything else I need to pass?
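For anyone hitting this later: a YARN application that stays in (STATE=ACCEPTED) is usually waiting for the ResourceManager to allocate resources. A quick diagnostic sketch using the standard Hadoop YARN CLI (the queue name `default` is an assumption; your cluster may use a different one):

```shell
# List applications still waiting for resources (ACCEPTED state)
yarn application -list -appStates ACCEPTED

# Inspect the capacity/usage of the queue the job was submitted to;
# "default" is an assumed queue name
yarn queue -status default
```

If the queue is at capacity, the pyspark shell will sit in ACCEPTED until other applications finish or are killed (`yarn application -kill <applicationId>`).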


#4

@Gurpreet_Singh The issue is resolved. Please try launching pyspark again.


#5