Help please - How to authenticate into a remote HDFS cluster?

@itversity
I am trying to read a text file from the remote HDFS on labs.itversity.com from my local environment. I can get it to run successfully when reading from the local filesystem.
Can you please let me know how I can authenticate when connecting to the labs HDFS? Here is my code that reads the file:
textFile = sc.textFile("hdfs://149.56.24.210:50070/user/simhalagna25/demo/data/nyse/NYSE_1997.txt")

Exception is:
py4j.protocol.Py4JJavaError: An error occurred while calling o20.partitions.
: java.net.ConnectException: Call From apples-MacBook-Pro.local/192.168.3.107 to gw01.itversity.com:50070 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

@Rag_Ch, interesting question, I have never tried it. But can you try using hdfs://nn01.itversity.com:8020 with your file location? @itversity, can you let us know how this works? CC @venkatreddy-amalla @gnanaprakasam @perraju
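One thing worth noting: 50070 in the original snippet is the NameNode's HTTP (WebHDFS) port, while `hdfs://` URIs go to the NameNode RPC port, which is commonly 8020. A minimal sketch, assuming the host `nn01.itversity.com` and port 8020 from the suggestion above (both unverified):

```python
# Sketch: build an hdfs:// URI for Spark's textFile() using the NameNode
# RPC port (assumed 8020), not the WebHDFS/HTTP port 50070 from the
# original snippet. Host and file path are assumptions from this thread.
def hdfs_uri(host, port, path):
    """Compose an hdfs:// URI from host, RPC port, and absolute path."""
    return "hdfs://{}:{}{}".format(host, port, path)

uri = hdfs_uri("nn01.itversity.com", 8020,
               "/user/simhalagna25/demo/data/nyse/NYSE_1997.txt")
# textFile = sc.textFile(uri)  # needs network access to the cluster to run
```

The actual read is left commented out since it requires connectivity to the labs cluster.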

Hi,
Thank you for responding.
I think the issue is authentication, just like when calling a web service on a remote server. In my case, the Spark client program (which contains the code above) runs on my laptop and tries to read the HDFS located on the itversity labs, but I am not sure how to pass user credentials, and I didn't find an API to do that. I know there are different security mechanisms such as Kerberos, passwordless authentication, etc., so I wanted to know whether and how there is a way to authenticate into the labs with my user credentials. In that sense, even hdfs://nn01.itversity.com:8020 will give the same 'connection refused' error if no authentication happens.
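For what it's worth, if the cluster uses Hadoop's "simple" authentication (no Kerberos), the client-side identity is just taken from the `HADOOP_USER_NAME` environment variable, with no password check at all. A sketch under that assumption, with the username as a placeholder:

```python
import os

# Sketch, assuming the cluster runs Hadoop "simple" authentication:
# the HDFS client reports the user named in HADOOP_USER_NAME. Set it
# BEFORE creating the SparkContext, since the JVM reads it at startup.
os.environ["HADOOP_USER_NAME"] = "simhalagna25"  # your labs username (assumption)

# from pyspark import SparkContext
# sc = SparkContext("local[*]", "remote-hdfs-read")
# textFile = sc.textFile(
#     "hdfs://nn01.itversity.com:8020/user/simhalagna25/demo/data/nyse/NYSE_1997.txt")
```

If the cluster is Kerberized instead, this does nothing; you would need a `kinit` ticket or keytab on the client, which only the lab admins can confirm.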
Btw, if I FTP my code onto the labs, I am confident it will work, as there won't be any authentication needed there. Hope someone can throw light on this. CC @venkatreddy-amalla @gnanaprakasam @perraju @itversity
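Another angle, since port 50070 is the NameNode's HTTP port: the file could in principle be fetched over the WebHDFS REST API, where `user.name` in the query string supplies the user under simple authentication (a Kerberized cluster would require SPNEGO instead). A sketch, with host, path, and username assumed from this thread:

```python
# Sketch: compose a WebHDFS OPEN URL for the NameNode HTTP port (50070).
# user.name is honored only under Hadoop simple authentication; this is
# an assumption about the labs cluster, not confirmed in the thread.
def webhdfs_open_url(host, path, user):
    """Build the WebHDFS REST URL to read a file as a given user."""
    return ("http://{}:50070/webhdfs/v1{}?op=OPEN&user.name={}"
            .format(host, path, user))

url = webhdfs_open_url("nn01.itversity.com",
                       "/user/simhalagna25/demo/data/nyse/NYSE_1997.txt",
                       "simhalagna25")
# import urllib.request
# data = urllib.request.urlopen(url).read()  # needs network access to the cluster
```

Note that WebHDFS redirects the OPEN call to a DataNode, so those hosts and ports must also be reachable from your laptop.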