STARTUP_MSG: classpath error received from HADOOP NAMENODE -FORMAT command. Please help

Hi

During Hadoop 2.7.3 installation, while running the hadoop namenode -format command, I am getting the following classpath error:

[hduser@storage hadoop]$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

17/01/09 09:44:42 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = storage/192.168.0.227
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.7.3
STARTUP_MSG: classpath =

/home/hduser/hadoop/etc/hadoop:/home/hduser/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/junit-4.11.jar:/home/hduser/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-auth-2.7.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/cur
BELOW IS WHAT I DID FROM MY END

  1. VERIFIED CORRECT JAVA LOCATION PATH
    If you look at the above error messages, it seems to be a Java path location error.

Therefore, in order to find the exact Java location, you can run the following command:
[hduser@storage ~]$ readlink -f /usr/bin/java
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el6_8.x86_64/jre/bin/java

[hduser@storage hadoop]$ which java
/opt/jdk1.8.0_111/bin/java
[hduser@storage hadoop]$ java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
[hduser@storage hadoop]$

From the above output, you take only this part --> /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el6_8.x86_64/
[hduser@storage hadoop]$ echo $JAVA_HOME
/usr/lib/jvm/jre-1.8.0-openjdk.x86_64/
[hduser@storage hadoop]$
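As a hedged sketch (not part of the original steps), the JAVA_HOME value can be derived from the resolved java binary instead of being copied by hand. The sample path below is the one readlink printed earlier in this thread; on a live system you would start from java_bin=$(readlink -f "$(which java)").

```shell
# Derive a JAVA_HOME candidate from a resolved java binary path.
# The sample path is the one readlink printed in this thread.
java_bin="/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el6_8.x86_64/jre/bin/java"
# Strip the trailing /bin/java to get the JRE directory...
jre_home=$(dirname "$(dirname "$java_bin")")
# ...then strip the /jre component to get the JDK root, which is the
# value hadoop-env.sh usually wants for JAVA_HOME.
java_home=${jre_home%/jre}
echo "$java_home"
```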

  2. DEFINED JAVA_HOME LOCATION INSIDE hadoop-env.sh
    [hduser@storage ~]$ vi hadoop-env.sh
    Added the following line:
    export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el6_8.x86_64/

Saved it. After that, I typed the following command again (step 3).
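The hadoop-env.sh edit described above can also be sketched as an idempotent shell step. This is only an illustration: a scratch temp file stands in for the real file, which in this thread lives under /home/hduser/hadoop/etc/hadoop/hadoop-env.sh.

```shell
# Sketch: idempotently ensure the JAVA_HOME export is present in
# hadoop-env.sh. A scratch copy is used here for illustration only.
env_sh=$(mktemp)
line='export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.101-3.b13.el6_8.x86_64/'
# Append the line only if no JAVA_HOME export is already there.
grep -q '^export JAVA_HOME=' "$env_sh" || printf '%s\n' "$line" >> "$env_sh"
# Show the effective setting.
grep '^export JAVA_HOME=' "$env_sh"
```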

  3. RAN THE HADOOP NAMENODE -FORMAT COMMAND again, but received the above classpath error.

Any help will be highly appreciated.
Thank You
Ujjwal Rana

Where are you running this? Quickstart VM or Sandbox or some other environment?

Hi
Thanks for the response. I am using a VMware virtual machine, and the guest OS is Oracle Enterprise Linux 6.7.

It works perfectly with hadoop-1.2.1, but with hadoop-2.7 it does not. I even re-downloaded the hadoop-2.7 tar.gz file many times and have been trying for almost three days, but I still could not solve it.

Will be thankful if you could help me.

With Best

Ujjwal Rana

What is the error you are getting?

Please find the error output below.

[hduser@storage hadoop]$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

17/01/06 19:14:03 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = storage/192.168.0.227
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.7.3
STARTUP_MSG: classpath = /home/hduser/hadoop/etc/hadoop:/home/hduser/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/junit-4.11.jar:/home/hduser/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-auth-2.7.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/curator-client-2.7.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/home/hduser/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/curator-framework-2.7.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/hduser/hadoop/share/hadoop/common/lib/activation-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/hduser/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/home/hduser/hadoop
/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/home/hduser/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/home/hduser/hadoop/share/hadoop/common/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/hduser/hadoop/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/hduser/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/home/hduser/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/hduser/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsr305-3.0.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/hadoop-annotations-2.7.3.jar:/home/hduser/hadoop/share/hadoop/common/lib/httpcore-4.2.5.jar:/home/hduser/hadoop/share/hadoop/common/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/hduser/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/hduser/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/hduser/hadoop/share/hadoop/common/lib/
protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/common/lib/jsch-0.1.42.jar:/home/hduser/hadoop/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/home/hduser/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.7.3-tests.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-nfs-2.7.3.jar:/home/hduser/hadoop/share/hadoop/common/hadoop-common-2.7.3.jar:/home/hduser/hadoop/share/hadoop/hdfs:/home/hduser/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/home/hduser
/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.3.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.7.3-tests.jar:/home/hduser/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.2.j
ar:/home/hduser/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.7.3.jar:/home/hduser/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/hduser/had
oop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.3-tests.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.3.jar:/home/hduser/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.3.jar:/home/hduser/hadoop/contrib/capacity-
scheduler/.jar:/home/hduser/hadoop/contrib/capacity-scheduler/.jar
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r baa91f7c6bc9cb92be5982de4719c1c8af91ccff; compiled by 'root' on 2016-08-18T01:41Z
STARTUP_MSG: java = 1.8.0_111
************************************************************/
17/01/06 19:14:09 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
17/01/06 19:14:09 INFO namenode.NameNode: createNameNode [-format]
17/01/06 19:14:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-bdd72531-6e00-4efd-86ff-5d8da839e5d5
17/01/06 19:14:10 INFO namenode.FSNamesystem: No KeyProvider found.
17/01/06 19:14:10 INFO namenode.FSNamesystem: fsLock is fair:true
17/01/06 19:14:11 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
17/01/06 19:14:11 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
17/01/06 19:14:11 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
17/01/06 19:14:11 INFO blockmanagement.BlockManager: The block deletion will start around 2017 Jan 06 19:14:11
17/01/06 19:14:11 INFO util.GSet: Computing capacity for map BlocksMap
17/01/06 19:14:11 INFO util.GSet: VM type = 64-bit
17/01/06 19:14:11 INFO util.GSet: 2.0% max memory 966.7 MB = 19.3 MB
17/01/06 19:14:11 INFO util.GSet: capacity = 2^21 = 2097152 entries
17/01/06 19:14:11 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
17/01/06 19:14:11 INFO blockmanagement.BlockManager: defaultReplication = 1
17/01/06 19:14:11 INFO blockmanagement.BlockManager: maxReplication = 512
17/01/06 19:14:11 INFO blockmanagement.BlockManager: minReplication = 1
17/01/06 19:14:11 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
17/01/06 19:14:11 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
17/01/06 19:14:11 INFO blockmanagement.BlockManager: encryptDataTransfer = false
17/01/06 19:14:11 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
17/01/06 19:14:11 INFO namenode.FSNamesystem: fsOwner = hduser (auth:SIMPLE)
17/01/06 19:14:11 INFO namenode.FSNamesystem: supergroup = supergroup
17/01/06 19:14:11 INFO namenode.FSNamesystem: isPermissionEnabled = true
17/01/06 19:14:11 INFO namenode.FSNamesystem: HA Enabled: false
17/01/06 19:14:11 INFO namenode.FSNamesystem: Append Enabled: true
17/01/06 19:14:11 INFO util.GSet: Computing capacity for map INodeMap
17/01/06 19:14:11 INFO util.GSet: VM type = 64-bit
17/01/06 19:14:11 INFO util.GSet: 1.0% max memory 966.7 MB = 9.7 MB
17/01/06 19:14:11 INFO util.GSet: capacity = 2^20 = 1048576 entries
17/01/06 19:14:11 INFO namenode.FSDirectory: ACLs enabled? false
17/01/06 19:14:11 INFO namenode.FSDirectory: XAttrs enabled? true
17/01/06 19:14:11 INFO namenode.FSDirectory: Maximum size of an xattr: 16384
17/01/06 19:14:11 INFO namenode.NameNode: Caching file names occuring more than 10 times
17/01/06 19:14:11 INFO util.GSet: Computing capacity for map cachedBlocks
17/01/06 19:14:11 INFO util.GSet: VM type = 64-bit
17/01/06 19:14:11 INFO util.GSet: 0.25% max memory 966.7 MB = 2.4 MB
17/01/06 19:14:11 INFO util.GSet: capacity = 2^18 = 262144 entries
17/01/06 19:14:11 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
17/01/06 19:14:11 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
17/01/06 19:14:11 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
17/01/06 19:14:11 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
17/01/06 19:14:11 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
17/01/06 19:14:11 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
17/01/06 19:14:11 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
17/01/06 19:14:11 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
17/01/06 19:14:12 INFO util.GSet: Computing capacity for map NameNodeRetryCache
17/01/06 19:14:12 INFO util.GSet: VM type = 64-bit
17/01/06 19:14:12 INFO util.GSet: 0.029999999329447746% max memory 966.7 MB = 297.0 KB
17/01/06 19:14:12 INFO util.GSet: capacity = 2^15 = 32768 entries
17/01/06 19:14:12 INFO namenode.FSImage: Allocated new BlockPoolId: BP-1267260243-192.168.0.227-1483701252055
17/01/06 19:14:12 WARN namenode.NameNode: Encountered exception during format:
java.io.IOException: Cannot create directory /home/hdfs/hadoop_store/hdfs/namenode/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:564)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:585)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:992)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1434)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
17/01/06 19:14:12 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: Cannot create directory /home/hdfs/hadoop_store/hdfs/namenode/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:564)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:585)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:992)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1434)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1559)
17/01/06 19:14:12 INFO util.ExitUtil: Exiting with status 1
17/01/06 19:14:12 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************

SHUTDOWN_MSG: Shutting down NameNode at storage/192.168.0.227
************************************************************/

You should identify the error and paste it over here.

In this case it is complaining about this directory, /home/hdfs/hadoop_store/hdfs/namenode/current

You probably have to delete /home/hdfs/hadoop_store/hdfs/namenode and re-run the format.
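The delete-and-recreate suggestion can be sketched as follows. This sketch deliberately uses a temporary root so it is safe to run anywhere; on the actual machine you would substitute /home/hdfs/hadoop_store/hdfs/namenode (and likely need sudo plus a chown to the hduser account).

```shell
# Recreate the NameNode storage directory fresh, then confirm it is
# writable before re-running "hdfs namenode -format".
# Temp root for illustration; substitute your dfs.namenode.name.dir.
root=$(mktemp -d)
nn_dir="$root/hadoop_store/hdfs/namenode"
rm -rf "$nn_dir"          # wipe any half-formatted state
mkdir -p "$nn_dir"        # recreate the full path
chmod 755 "$nn_dir"
[ -d "$nn_dir" ] && [ -w "$nn_dir" ] && echo "namenode dir ready"
```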

Just for confirmation: you mean I should try deleting the folder /home/hdfs/hadoop_store/hdfs/namenode and then recreating /home/hdfs/hadoop_store/hdfs/namenode again?

I am not sure; you can give it a try.

By the way, why are you setting up plain vanilla hadoop?

So which version do you recommend for the installation? Actually, I am new to this, but as per the Spark and Hadoop developer certification, which version of Hadoop is recommended? Or can you recommend any specific version to try?

OK, if you are aiming to be a developer, I recommend either using bigdata-labs.com or setting up the Cloudera or Hortonworks quickstart VM.

Setting it up like this takes an enormous amount of effort.

Thanks, but just in case it has to be done manually, which Hadoop version do you recommend for the installation? I only wanted to get your opinion.

The latest version of the Hortonworks sandbox or the Cloudera quickstart VM should be fine for development.

No, I am asking about the tar.gz file. If Hadoop is to be installed manually, which version do you recommend, for example hadoop-1.2.0? Which one?

The latest stable version should be fine.

Thanks. Since I am already in the middle of the installation, it would be a great help if you could provide the solution for the above error messages. I thought it best to resolve this issue first; only after that will I download the file you suggested. Please help.

As mentioned before by the Itversity team, the problem is with creating the directory: "java.io.IOException: Cannot create directory /home/hdfs/hadoop_store/hdfs/namenode/current". Please check that the user running namenode -format has the right permissions to create this directory. If you have sudo permissions on this server, then try running the command with sudo (sudo hdfs namenode -format).
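The permission check described above can be sketched as a small pre-flight helper. check_writable is a hypothetical function (not a Hadoop tool): it walks up from the target path to the nearest existing ancestor and reports whether the current user could create directories under it. On the troubled host you would pass /home/hdfs/hadoop_store/hdfs/namenode/current; a scratch directory is used here so the sketch runs anywhere.

```shell
# Hypothetical pre-flight helper: find the nearest existing ancestor of
# the target path and report whether it is writable by the current user.
check_writable() {
  dir=$1
  # Walk upward until we reach a directory that actually exists.
  while [ ! -d "$dir" ]; do dir=$(dirname "$dir"); done
  if [ -w "$dir" ]; then
    echo "ok: can create under $dir"
  else
    echo "blocked at $dir -- fix ownership (e.g. chown) or use sudo"
  fi
}
# Scratch path for illustration; substitute your real namenode dir.
check_writable "$(mktemp -d)/hadoop_store/hdfs/namenode/current"
```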

Good luck with your installation. It's a tedious process, and if you get stuck on any more problems, I would suggest using the HDP/CDH developer VMs or bigdata-labs.com, so you can better spend your valuable time learning the content.

-Thanks
