No memory left after importing all tables using Sqoop

Hi Durga,

I started practicing your Spark and Hadoop developer playlist, and I recently used Sqoop's import-all-tables to import data from MySQL to HDFS. Now I have only a few MB of space left. I tried removing files from the HDFS folder and from the trash. The partition holding my /etc and /tmp folders had 43 GB, and now only 16 MB remain.
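For reference, this is roughly what I ran to clean up (the paths are from my setup, so treat them as examples):

```sh
# Remove the imported tables from HDFS, bypassing the trash
# (/user/cloudera/* is just my target directory -- adjust to yours)
hdfs dfs -rm -r -skipTrash /user/cloudera/*

# Force-empty any checkpointed trash still counted against capacity
hdfs dfs -expunge
```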

What more should be done to get the space back? Right now I have no room left to perform any action on the Hadoop system.

I need assistance on this at the earliest. Thanks in advance.

You need to run du -sh /* and see which directories are using most of the storage.
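A minimal sketch of how to drill down (the directory names below are placeholders, not taken from your machine):

```sh
# Summarize disk usage of every top-level directory, human-readable,
# sorted smallest to largest
sudo du -sh /* 2>/dev/null | sort -h

# Drill into the biggest offender, e.g. if /var turns out to be the largest
sudo du -sh /var/* 2>/dev/null | sort -h

# Also check free space per filesystem, since HDFS capacity
# on a single-node setup follows the local disk
df -h
```

On pseudo-distributed setups the usual culprits are the DataNode data directory (often under /var or /tmp, depending on dfs.datanode.data.dir) and old YARN/MapReduce logs.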

PS: It is a storage issue, not memory!!!

Thanks. I had already done all of that anyway. I am wondering why such a large data table is used for the demo, making students get stuck. All of this made me learn a lot, though.
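For anyone else hitting this: one way to avoid filling the disk in the first place is to import only a slice of the data. This is a sketch, assuming a MySQL retail_db database like the one in the playlist; the connection details, credentials, and table names are placeholders to adapt to your environment:

```sh
# Import a single table instead of all of them, and only a subset of rows
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user --password '********' \
  --table orders \
  --where "order_id < 10000" \
  --target-dir /user/$(whoami)/retail_db/orders \
  -m 1

# Or, if you do want import-all-tables, skip the biggest tables
sqoop import-all-tables \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user --password '********' \
  --exclude-tables order_items,products \
  --warehouse-dir /user/$(whoami)/retail_db \
  -m 1
```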