New User: Steps to Transfer Files


Hello Developers!

I'm a newbie with only a SQL background in Windows-based tools. I'm new to the Linux environment and need your help with how to transfer files from my Windows machine to the Linux big data terminal.

I have experience using the Cloudera VM, but I thought this setup would help me explore a more realistic environment. In the VM I used to download files directly from the browser.

I watched the intro from Durga on how to download from GitHub, but it still isn't working for me. I need some step-by-step instructions on how to transfer the files. Thanks in advance.



@Srinivas_Kolli You can use the WinSCP software to transfer files from Windows to the Linux terminal.



I wasn't sure about this process either until I tried it. It's simple, but the details below may help:

Once WinSCP is installed on Windows, log in to it as follows:

1) Click 'Edit'.
2) Copy the password from the itversity lab homepage and paste it as the password.
3) Click 'Save' and then 'OK'.
4) Click 'Login' (the login credentials will be verified).
5) Paste the password again and click 'OK'.

The left pane shows the Windows file structure and the right pane shows the remote Linux machine. Drag and drop files between them to transfer.
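If you prefer something scriptable, WinSCP also ships a command-line client (winscp.com) alongside the GUI. A minimal sketch; the host name gw.itversity.com, the user, and the paths are placeholders, and the guard makes the snippet a no-op on machines where winscp.com is not installed:

```shell
# Scripted WinSCP transfer (placeholders: your_user, gw.itversity.com, paths).
# Skips cleanly where the winscp.com command-line client is not available.
if command -v winscp.com >/dev/null 2>&1; then
    winscp.com /command \
        "open sftp://your_user@gw.itversity.com/" \
        "put C:\data\deckofcards.txt /home/your_user/" \
        "exit"
fi
```

The same drag-and-drop transfer, but repeatable from a batch file.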



Thanks Rahul & Vijetha. My next question: how do we transfer files to the HDFS file system?
While using the VM I used to run ( hadoop dfs -put /source /destination )
Now, with the web console, what is the mechanism? Can we create our own folders in HDFS? I tried but it did not work. Please guide.



@Srinivas_Kolli You cannot create folders in HDFS using the WinSCP software.
You can use the command line to create folders in HDFS (hadoop fs -mkdir destination).
If you have Ambari (Hortonworks), you can create folders from it.
If you have Hue, you can also create folders in HDFS.
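To expand on the command-line option, a minimal sketch, assuming you are logged in to the lab's gateway node and your_user stands in for your username; the guard simply skips the commands on machines with no Hadoop client:

```shell
# Create a personal folder in HDFS. -p creates parent directories as needed
# and does not fail if the directory already exists.
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -mkdir -p /user/your_user/data
    hadoop fs -ls /user/your_user    # verify the new folder is listed
fi
```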



Thank you for the Ambari tip. I was trying to use mkdir on the command line and it did not work.
As a first step I created the folders in Ambari and then used the command-line statement below to move the file.

hadoop dfs -put /home/UserName/data/cards/deckofcards.txt /user/UserName/HiveData
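One small note: hadoop dfs still works but is deprecated in favour of hadoop fs (or hdfs dfs), and a follow-up -ls confirms the copy landed. A sketch with the same placeholder paths, guarded so it runs only where a Hadoop client exists:

```shell
# "hadoop dfs" is deprecated; "hadoop fs" is the current form of the same command.
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -put /home/UserName/data/cards/deckofcards.txt /user/UserName/HiveData
    hadoop fs -ls /user/UserName/HiveData    # verify the file arrived
fi
```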



@Srinivasi It would be good if you first watch the basic HDFS command videos before doing this.



@Vijetha An alternative command-line way: if you are familiar with the scp command, that would also let you transfer files to a Linux/remote server.

scp [file_to_be_transferred] user_id@Server_name:[Target_directory_location]

scp file1.txt root@clouderaVM:/user/cloudera/files/input



I am behind a restrictive firewall, so I am not able to use scp. I tried mailing the file to myself using the mail command in the shell; it was sent successfully, but I am not receiving it in my inbox. I can send a mail from the shell to my other Outlook account; that works fine, and I confirmed the address.

Bottom line: I need to be able to transfer my files to the edge node server and execute the code.



I am trying to move a file from local Unix storage to an HDFS directory, but it's not working.
scp /home/vikct001/user/vikrant/inputfiles/email.csv /user/vikct001/dev/hadoop/external/files

/user/vikct001/dev/hadoop/external/files is my HDFS location. It says:
/user/vikct001/dev/hadoop/external/files: No such file or directory
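scp copies between ordinary (local or remote) Linux filesystems, so it treats /user/vikct001/dev/hadoop/external/files as a local path, which does not exist; hence the error. As the earlier replies note, a local-to-HDFS copy needs the Hadoop client instead. A sketch with the same paths, guarded to be a no-op where no Hadoop client is installed:

```shell
# scp cannot write into HDFS; use the Hadoop client for local -> HDFS copies.
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -mkdir -p /user/vikct001/dev/hadoop/external/files   # ensure target exists
    hadoop fs -put /home/vikct001/user/vikrant/inputfiles/email.csv \
        /user/vikct001/dev/hadoop/external/files
fi
```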