Not able to launch pyspark shell

pyspark

#1

Hi,
When I try to launch the pyspark shell with the arguments below, it hangs and the shell does not open. It was working a couple of hours ago.

pyspark --master yarn --conf spark.ui.port=21888 --conf spark.dynamicAllocation.enabled=false --num-executors 2 --executor-memory 512M
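Note that the dynamic allocation property must be spelled `spark.dynamicAllocation.enabled`; a misspelled key is silently ignored, so `--num-executors` may not behave as expected. A hang at launch on YARN usually means the application is stuck in the ACCEPTED state waiting for containers. A quick way to check (assuming a standard Hadoop/YARN setup and that you have shell access to a gateway node) is:

```shell
# List applications currently pending or running on YARN;
# if your pyspark app sits in ACCEPTED, the queue has no free containers.
yarn application -list -appStates ACCEPTED,RUNNING

# If old or stuck applications are holding resources, kill them by id
# (<application_id> is a placeholder from the listing above).
yarn application -kill <application_id>
```

Once the blocking applications are cleared, relaunching the shell with the corrected property name should allocate executors normally.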

Need support, please…
