Hadoop and Spark not launching

#1

@itversity

I am unable to launch Hadoop/Spark because there is insufficient memory to run the Java Runtime Environment.
Please see below errors:
[rajsharmaplus@gw01 ~]$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00006685a7000000, 351272960, 0) failed; error='Cannot allocate memory' (errno=12)

[rajsharmaplus@gw01 ~]$ hadoop fs -ls /
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x0000728945000000, 351272960, 0) failed; error='Cannot allocate memory' (errno=12)

There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (mmap) failed to map 351272960 bytes for committing reserved memory.

An error report file with more information is saved as:

/home/rajsharmaplus/hs_err_pid4989.log

Can you please provide any update on this?

Regards, Raj


#2

It's working now! Thanks.


#3

If it still doesn't work, try the command below:
spark-shell --conf "spark.ui.port=10101"

That should fix it.
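If the gateway host is genuinely short on memory, the driver heap requested at launch can also be reduced; a sketch, assuming an HDP-style host where SPARK_MAJOR_VERSION picks between the installed Spark versions (the 512m figure is an illustrative value, not a recommendation for your cluster):

```shell
# Select the Spark 2 installation explicitly, so the
# "Multiple versions of Spark are installed" warning goes away
export SPARK_MAJOR_VERSION=2

# Request a smaller driver heap and a non-default UI port
spark-shell --driver-memory 512m --conf "spark.ui.port=10101"
```

If the JVM still cannot allocate memory, check what else is running on the gateway node before retrying.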
