Unable to open pyspark


#1

Team,

I am unable to open pyspark; please see the screenshot for details.

command: pyspark --master yarn --conf spark.ui.port=12562 --executor-memory 2G --num-executors 1
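Before escalating, a quick client-side sanity check can help rule out a local misconfiguration. This is a hypothetical diagnostic (not part of the original thread), using only the Python standard library: it checks whether the `pyspark` launcher is on the `PATH` and which interpreter the driver would use via the `PYSPARK_PYTHON` environment variable, if set.

```python
import os
import shutil

# Locate the pyspark launcher script on the PATH (None if not found).
launcher = shutil.which("pyspark")

# PYSPARK_PYTHON, when set, tells Spark which Python interpreter to use
# for the driver; otherwise Spark falls back to the system default.
driver_python = os.environ.get("PYSPARK_PYTHON", "system default")

print("pyspark launcher:", launcher or "not found on PATH")
print("driver Python:", driver_python)
```

If the launcher is missing or the interpreter path is stale, pyspark can fail to start even when spark-shell works, since spark-shell does not depend on the Python side at all.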


#2

There is a technical issue; we are working on it. Sorry for the inconvenience.


#3

It is working fine now. All the services are up and running.


#4

Just tried again and I am getting the same message.


#5

@akalita We are looking into it. Currently spark-shell is fine, but pyspark is having an issue. We will update you once it is resolved. Sorry for the inconvenience again.


#6

Now it is working fine. Please check and let us know.


#7

Thanks @vinodnerella, it is working fine now.


#8