Help please - How to authenticate into a remote HDFS cluster?

I am trying to read a text file from a remote HDFS cluster from my local environment. It runs successfully when reading from the local filesystem.
Can you please let me know how I can authenticate when connecting to the labs HDFS? Here is the code that reads the file:
textFile = sc.textFile("hdfs://")

Exception is:
py4j.protocol.Py4JJavaError: An error occurred while calling o20.partitions.
: Call From apples-MacBook-Pro.local/ to failed on connection exception: Connection refused; For more details see:
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

@Rag_Ch, interesting question; I have never tried it. But can you try using hdfs:// along with your full file location? @itversity, can you let us know how this works? CC @venkatreddy-amalla @gnanaprakasam @perraju
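To make the suggestion concrete: a fully qualified HDFS URI needs the NameNode host and RPC port, not just the hdfs:// scheme. Here is a minimal sketch; the hostname and port below are assumptions (8020 is a common NameNode RPC default), not the actual labs values.

```python
# Hypothetical values -- substitute the real labs NameNode host and port.
NAMENODE_HOST = "nn01.itversity.com"  # assumption, not the confirmed labs host
NAMENODE_PORT = 8020                  # common default NameNode RPC port


def hdfs_uri(path):
    """Build a fully qualified HDFS URI: hdfs://host:port/path."""
    return "hdfs://{0}:{1}{2}".format(NAMENODE_HOST, NAMENODE_PORT, path)


# With an existing SparkContext `sc`, the read would then look like:
#   textFile = sc.textFile(hdfs_uri("/user/your_username/somefile.txt"))
print(hdfs_uri("/user/your_username/somefile.txt"))
```

A bare `hdfs://` with no host falls back to the client's own default filesystem configuration, which on a laptop with no Hadoop config typically has nowhere to connect to.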

Thank you for responding.
I think the issue is 'authentication', just like when calling a web service on a remote server. In my case, the Spark client program (which contains the code above) is running on my laptop and trying to read the HDFS located on the itversity labs, but I am not sure how to pass user credentials, and I didn't find an API to do that. I know there are different security mechanisms (Kerberos, password-less authentication, etc.), so I wanted to know whether, and how, there is a way to authenticate into the labs using my user credentials. In that sense, even hdfs:// with the full file location will give the same 'connection refused' error, since no authentication is happening.
Btw, if I FTP my code into the labs and run it there, I am confident it will work, as there won't be any authentication needed. Hope someone can throw light on this. CC @venkatreddy-amalla @gnanaprakasam @perraju @itversity
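One thing that might be worth trying (untested against the labs): if the cluster runs Hadoop's "simple" (non-Kerberos) authentication, the server trusts whatever user name the client reports, and that can be overridden with the `HADOOP_USER_NAME` environment variable before the SparkContext is created. The username below is a placeholder, and this does not work against a Kerberized cluster, which needs a `kinit`/keytab login instead.

```python
import os

# Hypothetical username -- replace with your actual lab username.
HDFS_USER = "your_lab_username"

# In simple-auth mode the Hadoop client picks up HADOOP_USER_NAME and
# presents it as the caller's identity; it must be set before the
# SparkContext (and hence the HDFS client) is initialized.
os.environ["HADOOP_USER_NAME"] = HDFS_USER

# After this, a read with a fully qualified URI would be attempted as
# HDFS_USER, e.g.:
#   sc.textFile("hdfs://<namenode-host>:8020/path/to/file").count()
```

Note that 'connection refused' itself happens before any authentication check, so the NameNode host/port may also need to be reachable from outside the cluster (no firewall blocking the RPC port) for this to matter.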