Spark-shell and pyspark are not launching on my node


I am trying to launch spark-shell and pyspark and both are failing. My user id is rajeshv28. Could you please check it and let me know?



I am also facing the same issue, and my user id is debjanis. Can you please get this fixed?

Hi itversity,

Could you please look into this issue ASAP?


I am also facing the same issue

I am also having the same issue. I am getting the following:

[prashantchopra@gw01 ~]$ pyspark
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Python 2.7.5 (default, Sep 15 2016, 22:37:39)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
17/05/03 21:43:41 ERROR SparkUI: Failed to bind SparkUI Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at Method)

@prashant.x.chopra @Kranthi_Kumar @rajeshv28 @debjani.s
Use the conf parameter while launching spark-shell / pyspark; refer to the earlier post.
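For anyone hitting this later, a sketch of what the conf parameter fix looks like. spark.ui.port and spark.port.maxRetries are standard Spark properties (the error log itself suggests them), and SPARK_MAJOR_VERSION is the HDP variable from the warning above; the port number 12654 is just an arbitrary free port, pick any unused one on the gateway:

```shell
# The default SparkUI port (4040) is taken by other sessions on the
# shared gateway, and Spark gave up after 16 retries. Point the UI at
# an explicitly chosen free port instead:
spark-shell --conf spark.ui.port=12654

# Same flag works for pyspark:
pyspark --conf spark.ui.port=12654

# Alternatively, let Spark retry more ports past 4040 before failing:
pyspark --conf spark.port.maxRetries=100

# Optional: silence the "Multiple versions of Spark" warning by
# pinning the version (HDP-specific environment variable):
export SPARK_MAJOR_VERSION=1
```

Once the shell is up, the banner prints which port the UI actually bound to, so you can confirm the override took effect.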

Thanks, I tried that and was able to launch spark-shell! After a while it started working even without the conf parameter, though.