Not able to start Spark-shell


I am not able to start the Spark shell on the ITversity lab. I tried using the conf parameter as well: spark-shell --conf spark.ui.port=22322 --master yarn-client, but I am still facing the same issue. I am getting the error below; can anyone please help?

Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00006f75a3000000, 351272960, 0) failed; error='Cannot allocate memory' (errno=12)

I am facing the same issue as well. Did you manage to get it up and running yet?
Dheeraj Rampal

Not yet… I am still facing the same issue!

Looks like it's working fine now.

yes, it is working. :slight_smile:

We are getting the following Spark error: Address already in use: Service 'SparkUI' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

Can someone please look at it?

@kvlprasad123 Try this command

spark-shell --conf spark.ui.port=22322 --conf spark.port.maxRetries=100 --master yarn-client
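Note that each configuration property needs its own --conf flag; a single --conf only consumes the one key=value pair immediately after it. If you also see the "Multiple versions of Spark are installed" warning, exporting SPARK_MAJOR_VERSION first selects Spark 2 explicitly. A minimal sketch, assuming an HDP-style lab cluster where both Spark versions are installed and port 22322 is free:

```shell
# Sketch, assuming an HDP-style cluster with both Spark 1 and Spark 2 installed.
# Select Spark 2 explicitly so the "Spark1 will be picked by default" warning goes away.
export SPARK_MAJOR_VERSION=2

# Each property gets its own --conf flag. A less common UI port plus a higher
# retry ceiling helps avoid "Service 'SparkUI' failed after 16 retries" on a
# shared gateway node where many users compete for ports.
spark-shell \
  --conf spark.ui.port=22322 \
  --conf spark.port.maxRetries=100 \
  --master yarn-client
```

As a side note, on Spark 2 the yarn-client master string is deprecated in favor of --master yarn --deploy-mode client, though the old form still works.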
