Need additional blocks in HDFS while running Sqoop export

Error:
cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482646432089_0001. Name node is in safe mode.
The reported blocks 1268 needs additional 39 blocks to reach the threshold 0.9990 of total blocks 1308.
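For context, the NameNode stays in safe mode until the DataNodes have reported enough blocks to reach that threshold (here, 99.9% of 1308 blocks). A few commands that help see the current state, assuming a standard HDFS client on the VM:

    # Is the NameNode still in safe mode?
    hdfs dfsadmin -safemode get

    # Capacity, DFS used/remaining, and per-DataNode block counts
    hdfs dfsadmin -report

    # Overall file system health, including missing or corrupt blocks
    hdfs fsck /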

I tried running the "hdfs dfsadmin -safemode leave" command, but it doesn't help; I still get the error below:

16/12/24 10:38:00 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007
16/12/24 10:38:00 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot delete /tmp/hadoop-yarn/staging/cloudera/.staging/job_1482602419946_0007. Name node is in safe mode.
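In other words, the NameNode has not received reports for 39 of its blocks, so it stays in (or falls back into) safe mode and the job cannot clean up its staging directory. A rough way to find which files own the unreported blocks, assuming fsck can be run as the cloudera user:

    # List the files whose blocks are missing or corrupt
    hdfs fsck / -list-corruptfileblocks

    # Once safe mode is actually off, unrecoverable files can be dropped
    # (destructive; only for files you can afford to lose):
    # hdfs fsck / -delete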

Can someone please help resolve this issue?

This is after giving the "hdfs dfsadmin -safemode leave" command.

Where are you running this? Cloudera Quickstart VM?

Yes, I'm running it in the Cloudera Quickstart VM.

It would be better to restart the VM, restart the services, and then try again.
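A minimal sketch of the restart, assuming the Quickstart VM's Hadoop services are managed through init scripts rather than Cloudera Manager (service names can differ between CDH versions):

    # Restart HDFS, then YARN
    sudo service hadoop-hdfs-datanode restart
    sudo service hadoop-hdfs-namenode restart
    sudo service hadoop-yarn-resourcemanager restart
    sudo service hadoop-yarn-nodemanager restart

    # Confirm safe mode is off before rerunning the Sqoop export
    hdfs dfsadmin -safemode get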

How much memory do you have on your laptop, and how much have you given to the virtual machine?

I have allocated 40 GB for the Cloudera VM.
It seems I had used all of it for blocks.
Now I'm able to do the Sqoop export after clearing the blocks. Thanks, Durga.
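For anyone hitting the same thing, the fix here was simply freeing up HDFS space. A rough sketch with a hypothetical path; check what is actually using the space before deleting anything:

    # See which directories are consuming HDFS space
    hdfs dfs -du -h /

    # Remove data that is no longer needed (example path only)
    hdfs dfs -rm -r -skipTrash /user/cloudera/old_unneeded_data

    # Empty the HDFS trash so the space is actually reclaimed
    hdfs dfs -expunge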

OK, so it was a storage issue.