Spark-shell launching issue

pyspark-shell
pyspark
#1

Hi All,

I am not able to launch spark-shell (Scala). Also, in the pyspark shell I am getting a "spark context not defined" error.
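One way to surface the underlying error (a minimal sketch, assuming the shell opened but `sc` was never created, and the app name below is just an arbitrary example) is to build the context by hand:

```python
# Minimal sketch: create the SparkContext manually when the pyspark shell
# starts without defining `sc`, to see the real error message.
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("manual-sc-check")  # app name is an arbitrary example
sc = SparkContext(conf=conf)                      # raises with the real cause if resources are exhausted
print(sc.version)                                 # confirms the new context is usable
```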

0 Likes

#2

The issue is fixed. There were too many sessions open, so it was running out of memory.
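For reference, a quick check along these lines (a rough sketch, assuming the shell sessions run as local driver JVMs on the gateway node, Linux only) shows how many shells are open and how much memory is left:

```python
# Rough diagnostic sketch: count open spark-shell / pyspark drivers and show free memory.
import subprocess

ps = subprocess.check_output(["ps", "-eo", "args", "--no-headers"], text=True)
shells = [line for line in ps.splitlines()
          if "org.apache.spark.repl.Main" in line or "pyspark-shell" in line]
print("open shell sessions:", len(shells))

# Free memory on the node (values in MiB).
print(subprocess.check_output(["free", "-m"], text=True))
```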

0 Likes

#3

@itversity, Sir, what could be a permanent fix for this issue? And what does "running out of memory" mean in the context of spark-shell?

0 Likes

#4

We need to schedule a shell script which will kill inactive sessions.
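Something along the following lines could be scheduled from cron (a minimal sketch, assuming "inactive" is approximated by process age, the 12-hour threshold is only an example value, and the job runs with enough privileges to kill other users' processes):

```python
#!/usr/bin/env python
# Sketch of a cleanup job: kill spark-shell / pyspark drivers older than a threshold.
# Could be scheduled from cron, e.g.:  0 * * * * /usr/local/bin/kill_idle_spark_shells.py
import os
import signal
import subprocess

MAX_AGE_SECONDS = 12 * 60 * 60  # example threshold: 12 hours

def list_spark_shell_processes():
    """Return (pid, age_seconds, command) for spark-shell / pyspark drivers."""
    # `etimes` (elapsed seconds) needs a reasonably recent procps.
    out = subprocess.check_output(
        ["ps", "-eo", "pid,etimes,args", "--no-headers"], text=True
    )
    procs = []
    for line in out.splitlines():
        pid, age, args = line.strip().split(None, 2)
        if "org.apache.spark.repl.Main" in args or "pyspark-shell" in args:
            procs.append((int(pid), int(age), args))
    return procs

def main():
    for pid, age, args in list_spark_shell_processes():
        if age > MAX_AGE_SECONDS:
            print("killing pid %d (age %ds): %s" % (pid, age, args[:80]))
            os.kill(pid, signal.SIGTERM)

if __name__ == "__main__":
    main()
```

A real script would likely use a better inactivity signal than process age (for example CPU time or last terminal activity), but the structure would be the same.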

1 Like

#5

@itversity - I am getting the spark-shell / pyspark launch issue.

0 Likes

#6

@itversity - I am again facing the spark-shell / pyspark launch issue. Could you please kill the inactive sessions? Do we have an option to see which sessions are inactive and kill them ourselves?
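For example, something like this could list the current user's running applications (a sketch, assuming the sessions show up as YARN applications and the standard `yarn` CLI is available on the gateway):

```python
# Sketch: list the current user's RUNNING YARN applications.
import getpass
import subprocess

user = getpass.getuser()
out = subprocess.check_output(
    ["yarn", "application", "-list", "-appStates", "RUNNING"], text=True
)
for line in out.splitlines():
    if user in line:
        print(line)  # columns include the application id, name, and user
```

A stale application of your own could then be stopped with `yarn application -kill <application id>` (the id being whatever the listing shows).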

0 Likes

#7

Yes, when we go live, all support members will get access to the cluster.
Also, we need to write a script to kill older inactive sessions.

0 Likes