PySpark not working, it gives an error


#1

I ran the command below to start pyspark:
pyspark --master yarn --num-executors 1 --executor-memory 512M --conf spark.ui.port=shuf -i 12000-65000 -n 1

It gave a big error, and when I try to create the SparkContext (sc), it fails.

[smohank7@gw01 ~]$ pyspark --master yarn --num-executors 1 --executor-memory 512M --conf spark.ui.port=shuf -i 12000-65000 -n 1
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Python 2.7.5 (default, Sep 15 2016, 22:37:39)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00006ef645000000, 716177408, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 716177408 bytes for committing reserved memory.
An error report file with more information is saved as:
/home/smohank7/hs_err_pid13223.log

Traceback (most recent call last):
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/shell.py", line 43, in <module>
    sc = SparkContext(pyFiles=add_files)
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "/usr/hdp/2.5.0.0-1245/spark/python/pyspark/java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

I also tried just pyspark --master yarn; that doesn't work either.
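
The "Cannot allocate memory" warning points at the gateway node itself being out of memory rather than anything Spark-specific. The JVM crash report and the node's free memory can be checked like this (a sketch, assuming a standard Linux shell; the log path is the one from the output above):

free -m                                  # how much memory is actually free on the node
ulimit -a                                # per-user limits that could block the ~700 MB mmap
less /home/smohank7/hs_err_pid13223.log  # the full JVM crash report named in the error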


#2

@smohank The issue has already been taken care of. Please check now.
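
For reference, the original command also looks like it lost a command substitution: written as above, the shell passes --conf spark.ui.port=shuf, -i, 12000-65000, -n and 1 as separate arguments to pyspark. Assuming the intent was to bind the Spark UI to a random port picked by shuf (and that Spark 1 is the intended version, as the default in your log suggests), a sketch of the corrected invocation would be:

export SPARK_MAJOR_VERSION=1   # silences the "multiple versions of Spark" warning
pyspark --master yarn \
  --num-executors 1 \
  --executor-memory 512M \
  --conf spark.ui.port=$(shuf -i 12000-65000 -n 1)

Note this only fixes the argument parsing; the "Cannot allocate memory" failure means the driver JVM also needs enough free memory on the gateway node to start.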