Cannot launch pyspark: memory error


#1

Hello,
Please help me with this error. I can't launch pyspark.
Thanks!

[adas_iitm@gw03 products]$ pyspark
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000003d5600000, 702021632, 0) failed; error='Cannot allocate memory' (errno=12)
[adas_iitm@gw03 products]$ free -m
              total        used        free      shared  buff/cache   available
Mem:          64163       59953         355        3329        3855         118
Swap:          1021        1021           0


#2

@adas_iitm The issue has already been taken care of.
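
For anyone hitting the same error later, a minimal sketch of one common workaround, assuming an HDP-style install where SPARK_MAJOR_VERSION selects the Spark build and the host is simply short on free RAM. The Spark 2 choice and the 512m heap below are illustrative assumptions, not the confirmed fix from this thread:

# Pick a Spark version explicitly so the launcher stops defaulting to Spark1,
# and cap the driver heap so the JVM does not try to reserve ~700 MB up front.
export SPARK_MAJOR_VERSION=2
pyspark --driver-memory 512m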