Not able to open pyspark



I am unable to open pyspark; please see the screenshot for details.

command: pyspark --master yarn --conf spark.ui.port=12562 --executor-memory 2G --num-executors 1


There is a technical issue; we are working on it. Sorry for the inconvenience.


It is working fine now. All the services are up.


I just tried again and am getting the same message.


@akalita We are checking on that. spark-shell is fine now, but pyspark is still having an issue. We will update you once it is resolved. Sorry for the inconvenience again.


Now it is working fine. Please check and let us know.


Thanks @vinodnerella, it is working fine now.