Spark-shell launching issue

Hi All,

I am not able to launch spark-shell (Scala). In the pyspark shell, I am getting a "spark context not defined" error.

The issue is fixed. There were too many sessions open, and the cluster was running out of memory.
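For anyone hitting this later: one way to confirm the cluster is out of memory (assuming the shells run on YARN, as on a typical lab cluster) is to query the ResourceManager REST API. `<rm-host>` below is a placeholder for the actual ResourceManager host; 8088 is the default web port.

```bash
# Cluster-wide memory headroom; the response includes totalMB,
# allocatedMB, and availableMB under "clusterMetrics".
curl -s "http://<rm-host>:8088/ws/v1/cluster/metrics"
```

When availableMB is near zero, new spark-shell / pyspark sessions cannot get containers, which is why they fail to launch or come up without a SparkContext defined.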

@itversity, sir, what could be a permanent fix for this issue? And what does "running out of memory" mean in the context of spark-shell?

We need to schedule a shell script that will kill inactive sessions. A sketch of what that could look like is below.
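This sketch assumes each spark-shell / pyspark session runs as a YARN application of type SPARK. YARN does not report shell idleness directly, so it treats any session running longer than a threshold as inactive; the threshold and script name are illustrative, not the actual lab setup.

```bash
#!/bin/bash
# cleanup_spark_sessions.sh (hypothetical): kill Spark sessions on YARN
# that have been running longer than MAX_AGE_HOURS. Long-running is used
# here as a proxy for inactive.
MAX_AGE_HOURS=2
now_ms=$(( $(date +%s) * 1000 ))

yarn application -list -appTypes SPARK -appStates RUNNING 2>/dev/null |
  awk '$1 ~ /^application_/ { print $1 }' |
  while read -r app_id; do
    # Start-Time in the status report is epoch milliseconds; the exact
    # label can vary by Hadoop version.
    start_ms=$(yarn application -status "$app_id" 2>/dev/null |
               awk -F' : ' '/Start-Time/ { print $2 }')
    [ -z "$start_ms" ] && continue
    age_hours=$(( (now_ms - start_ms) / 3600000 ))
    if [ "$age_hours" -ge "$MAX_AGE_HOURS" ]; then
      echo "Killing $app_id (running for ${age_hours}h)"
      yarn application -kill "$app_id"
    fi
  done
```

It could then be scheduled from cron, e.g. `0 * * * * /path/to/cleanup_spark_sessions.sh` to run hourly.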


@itversity - I am getting the spark-shell / pyspark launch issue.

@itversity - I am again facing the spark-shell / pyspark launch issue. Could you please kill the inactive sessions? Do we have an option to see which sessions are inactive and kill them?
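For reference, sessions can be listed and killed by hand with the YARN CLI (again assuming they run as YARN applications; the application ID below is a placeholder):

```bash
# Show running Spark sessions with their IDs, users, and start times
yarn application -list -appStates RUNNING -appTypes SPARK

# Kill one session by its application ID
yarn application -kill application_1234567890123_0001
```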

Yes, when we go live, all support members will get access to the cluster.
We also need to write a script to kill older inactive sessions.