Sandbox HDP - 1 datanode(s) running and 1 node(s) are excluded

I have installed the HDP 2.5 sandbox on VirtualBox.
I can open all the web interfaces for Ambari, the NameNode, the DataNode, and the other services.

I have a simple Java program that copies files from my local Windows machine to HDFS.

I am getting the error below:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/anil/wxeventcardschecknotpass.csv could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1641)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3198)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3122)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:843)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:500)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)

My code is:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsWrite {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to datanodes by hostname instead of their (VM-internal) IP addresses.
        conf.set("dfs.client.use.datanode.hostname", "true");
        uploadSolution2(conf);
    }

    private static void uploadSolution2(Configuration conf) throws Exception {
        try {
            // Local Windows source file and HDFS destination path.
            String localStr = "C://w2bc//data//wxeventcardschecknotpass.csv";
            String dst = "hdfs://sandbox.hortonworks.com:8020/user/anil/wxeventcardschecknotpass.csv";
            InputStream in = new BufferedInputStream(new FileInputStream(localStr));
            FileSystem fs = FileSystem.get(URI.create(dst), conf);
            OutputStream out = fs.create(new Path(dst));
            // copyBytes closes both streams because the last argument is true.
            IOUtils.copyBytes(in, out, 4096, true);
            System.out.println("success");
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
Please reply, this is urgent.

Your HDFS seems to be down. Go to Ambari and verify whether everything is up and running.
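If Ambari shows everything green, you can also check from the client side what address the NameNode hands out for block writes. A minimal sketch, assuming a Hadoop 2.x client and the same sandbox hostname used in your code:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class ListDataNodes {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("dfs.client.use.datanode.hostname", "true");
        FileSystem fs = FileSystem.get(URI.create("hdfs://sandbox.hortonworks.com:8020"), conf);
        if (fs instanceof DistributedFileSystem) {
            for (DatanodeInfo dn : ((DistributedFileSystem) fs).getDataNodeStats()) {
                // getXferAddr() is the host:port the client must reach to write block data.
                System.out.println(dn.getHostName() + " -> " + dn.getXferAddr());
            }
        }
        fs.close();
    }
}

If this prints a VM-internal address, or a hostname your Windows machine cannot resolve, the write will fail exactly as in your stack trace even though everything inside the VM is healthy.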

I can see Ambari running well.
I can also open the NameNode and DataNode UIs.

I grepped for the DataNode and NameNode processes; they are running fine.

Can you share a screenshot after logging into Ambari?

[Ambari screenshots attached]

I have uploaded all the screenshots. I am trying to run the program from my Windows machine.

Your HDFS is in maintenance mode. Remove it from maintenance mode and you will see the errors.

What is the configuration of your laptop, and how much memory have you given to the virtual machine?

HDFS is no longer in maintenance mode, but I still get the same error.
The laptop has 16 GB of RAM, and 8 GB is allocated to VirtualBox, which should be fine.
Should I run this Java program from the guest OS by uploading the jar there from Windows, or is it fine to run it from Windows directly?

Any suggestions on the above issue? I have been stuck on this for a long time.

Your NameNode might be down. If it is in maintenance mode, you have to go to the service and remove it from maintenance mode.

It's not in maintenance mode, and it's not down either. I can see the file being created in the File Browser, but there is no content inside it.

Yes, you have to copy the jar file to the guest OS and run it from there.
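The empty file is the telltale sign: creating the file only requires the NameNode RPC on port 8020, which clearly works from your Windows machine, but writing the actual block data requires a direct connection from the client to the DataNode's transfer port, and in a sandbox VM that address is typically not reachable from the host. A quick way to confirm this from Windows is a plain socket probe; a minimal sketch, assuming the DataNode is listening on the Hadoop 2.x default transfer port 50010:

import java.net.InetSocketAddress;
import java.net.Socket;

public class ProbeDataNode {
    public static void main(String[] args) throws Exception {
        // 50010 is the default dfs.datanode.address port in Hadoop 2.x;
        // adjust it if your sandbox maps the port differently (assumption).
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress("sandbox.hortonworks.com", 50010), 5000);
            System.out.println("DataNode transfer port is reachable");
        }
    }
}

If the probe times out, the client on Windows simply cannot deliver the block data, and running the jar inside the guest, as suggested above, sidesteps the problem entirely.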