Spark-shell getting killed randomly

spark-shell

#1

Hi guys,
I've randomly seen my spark-shell get killed.

I am using it over SSH.

scala> l_numbersrddflatmap.map(word => ("ey/usr/hdp/current/spark-client/bin/spark-shell: line 41: 18257 Killed                  "$FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"

I was just typing something in the spark shell when this suspicious "Killed" message for the process showed up…
PS: I had only one terminal open, so I am not sure what happened…
#annoying
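A bare "Killed" like that means something sent SIGKILL to the process, and on Linux the usual suspect is the kernel OOM killer. If that is what happened it normally leaves a trace in the kernel log; a quick check (assuming you can read the kernel ring buffer on the gateway host):

dmesg | grep -iE 'out of memory|killed process'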

ps output:

[mamatucci@gw01 ~]$ ps -aux
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
mamatuc+  7304  0.0  0.0 153088  1604 pts/50   R+   14:04   0:00 ps -aux
mamatuc+ 17880  0.0  0.0 149900  1960 ?        R    13:19   0:00 sshd: mamatucci@pts/50
mamatuc+ 17949  0.0  0.0 149924  1716 ?        S    13:19   0:00 sshd: mamatucci@notty
mamatuc+ 17953  0.0  0.0  52696   604 ?        Ss   13:19   0:00 /usr/libexec/openssh/sftp-server
mamatuc+ 17985  0.0  0.0 116664  2472 pts/50   Ss   13:19   0:00 -bash
mamatuc+ 28257  0.7  2.9 7068800 967784 ?      Tl   10:41   1:36 /usr/jdk64/jdk1.8.0_77/bin/java -Dhdp.version=2
[mamatucci@gw01 ~]$
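Note the java process 28257: it is in state Tl (stopped) and holding close to 1 GB of RSS, which looks like a leftover from an earlier session. A tidier way to spot leftovers like that, sorted by memory (assuming GNU ps on the gateway host):

ps -u $(id -un) -o pid,stat,etime,rss,args --sort=-rss | head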

Maybe this is useful for you:

  • I run my spark-shell as:
spark-shell --master yarn --num-executors 1 --executor-memory 512M --conf spark.ui.port=`shuf -i 12000-65000 -n 1`
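The shuf just picks a random UI port so two shells on the same gateway don't collide. Note that the process that got killed is the spark-submit launched by the spark-shell wrapper, i.e. the driver JVM running on the gateway host itself, so if the gateway is tight on memory it may also help to cap the driver explicitly; a variant of the same launch (just a sketch, the 512m driver size is an arbitrary example):

spark-shell --master yarn --num-executors 1 --executor-memory 512M \
  --driver-memory 512m \
  --conf spark.ui.port=`shuf -i 12000-65000 -n 1`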

#2

Interesting… I just tried to launch the shell again and this is what I got:

Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x0000686704c00000, 716177408, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 716177408 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /home/mamatucci/hs_err_pid10651.log
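errno=12 (ENOMEM) means the OS itself refused the ~716 MB mmap, i.e. the gateway host has no free memory left when the new JVM tries to start. Before relaunching it is worth checking what is actually free, and since the wrapper also complains about multiple Spark versions, pinning the one you want (sketch; SPARK_MAJOR_VERSION=2 is just an example, use whichever version you actually need):

free -m                       # how much memory is left on the gateway host
export SPARK_MAJOR_VERSION=2  # pick a Spark version explicitly instead of the default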


#3

solution

pkill -9 -u `id -u username`

Some leftover processes from earlier sessions were still hanging around and eating the memory…
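If you want to see what you are about to kill first (and note that pkill -9 on your own UID will most likely also take down your current SSH session), something like this works, assuming a reasonably recent procps where pgrep has -a:

pgrep -u $(id -un) -a java   # list your java processes with full command lines
pkill -9 -u $(id -u)         # then kill everything owned by your UID, as above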


#4