I am getting the error below when I try to launch PySpark. Can someone take a look?
[srimakurthi@gw01 ~]$ pyspark
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Python 2.7.5 (default, Sep 15 2016, 22:37:39)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00007c79c4800000, 716177408, 0) failed; error='Cannot allocate memory' (errno=12)
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (mmap) failed to map 716177408 bytes for committing reserved memory.
An error report file with more information is saved as:
Traceback (most recent call last):
File "/usr/hdp/18.104.22.168-1245/spark/python/pyspark/shell.py", line 43, in <module>
sc = SparkContext(pyFiles=add_files)
File "/usr/hdp/22.214.171.124-1245/spark/python/pyspark/context.py", line 112, in __init__
File "/usr/hdp/126.96.36.199-1245/spark/python/pyspark/context.py", line 245, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway()
File "/usr/hdp/188.8.131.52-1245/spark/python/pyspark/java_gateway.py", line 94, in launch_gateway
raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
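For context, the two messages point at two separate things: the SPARK_MAJOR_VERSION warning (HDP installs Spark 1 and Spark 2 side by side and defaults to Spark 1 when the variable is unset) and the JVM failing to mmap ~716 MB for the driver heap, which then kills the Java gateway. A minimal sketch of what one might try before launching, assuming an HDP-style install; the 512m figure is a hypothetical starting value, not a recommendation:

```shell
# Pick Spark 2 explicitly; otherwise HDP falls back to Spark 1 as warned above
export SPARK_MAJOR_VERSION=2

# Check how much memory is actually free; the JVM failed to commit ~716 MB
free -m

# Relaunch with a smaller driver heap (guarded so this is a no-op
# on machines where pyspark is not on the PATH)
command -v pyspark >/dev/null && pyspark --driver-memory 512m
```

If the machine genuinely has less free memory than the default driver heap, lowering --driver-memory (or freeing memory on the host) is what stops the os::commit_memory failure.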