Hadoop Installation in Red Hat Linux through Cloudera Manager - Installation Path B

#1

Thanks in advance for all your help.

I am trying to set up a 7-node cluster using Cloudera Manager - Installation Path B.
When it tries to install HDFS, I receive the following error on all the data nodes.
The NameNode and Secondary NameNode started fine, but none of the DataNodes install properly; each fails with the error below.

Can’t open /var/run/cloudera-scm-agent/process/47-hdfs-DATANODE/supervisor.conf: Permission denied.

  • make_scripts_executable
  • find /var/run/cloudera-scm-agent/process/47-hdfs-DATANODE -regex '.*.(py|sh)$' -exec chmod u+x '{}' ';'
  • '[' DATANODE_MAX_LOCKED_MEMORY '!=' '' ']'
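For anyone tracing the same step: the find/chmod command in the log is the agent's make_scripts_executable step. A self-contained sketch of what it does (sandbox paths and filenames are made up for illustration; the real directory is /var/run/cloudera-scm-agent/process/47-hdfs-DATANODE):

```shell
# Sandbox reproduction of the make_scripts_executable step.
dir=$(mktemp -d)
touch "$dir/datanode.py" "$dir/start.sh" "$dir/supervisor.conf"
chmod 600 "$dir"/*        # start with no execute bits anywhere

# Same shape as the logged command (GNU find, extended regex syntax):
find "$dir" -regextype posix-extended -regex '.*\.(py|sh)$' \
     -exec chmod u+x '{}' ';'

ls -l "$dir"              # .py and .sh gain u+x; supervisor.conf is untouched
```

Note that this step only chmods the agent's own files, so a "Permission denied" here usually points at the ownership of the process directory itself rather than at the scripts.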

#2

What is the OS username you are using?


#3

Hi Arvind,

It seems DATANODE_MAX_LOCKED_MEMORY (i.e. dfs.datanode.max.locked.memory) has not been set properly in hdfs-site.xml.

Please first check "ulimit -l" at the OS level. Then add the entry in /etc/security/limits.conf as:
hdfs - memlock 130

and then update the property as below:
dfs.datanode.max.locked.memory = 128000 bytes
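In plain hdfs-site.xml form that property would look like the snippet below (the 128000 value is just the example from this thread; on a Cloudera Manager deployment you would normally set it through the DataNode configuration in the CM UI rather than editing the file by hand):

```xml
<!-- hdfs-site.xml sketch; value is in bytes -->
<property>
  <name>dfs.datanode.max.locked.memory</name>
  <value>128000</value>
</property>
```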

NOTE: The values above depend on the operating system and its distribution.
The hdfs-site.xml property value (in bytes) must be less than the "ulimit -l" value, which Linux reports in KB (i.e. less than ulimit -l × 1024 bytes).
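The unit mismatch is easy to get wrong, so here is a small sanity-check sketch (the two values are the examples from this thread, hard-coded for illustration, not read from a live system): `ulimit -l` reports KB on Linux, while the HDFS property is in bytes.

```shell
# memlock limit from /etc/security/limits.conf (KB, as ulimit -l reports it)
limit_kb=130
# dfs.datanode.max.locked.memory (bytes)
prop_bytes=128000

limit_bytes=$((limit_kb * 1024))   # 130 KB = 133120 bytes
if [ "$prop_bytes" -le "$limit_bytes" ]; then
  echo "OK: property ($prop_bytes B) fits within memlock limit ($limit_bytes B)"
else
  echo "FAIL: raise memlock in limits.conf or lower the property"
fi
```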
