Getting error while running Spark-Scala program in Eclipse whenever passing I/O arguments

I get an error while running a Spark-Scala program in Eclipse whenever I pass I/O arguments, even though I have winutils.exe placed in the folder and the environment variables are set up correctly. Please let me know how to fix this.

java.io.IOException: Could not locate executable C:\winutils\bin\winutils.exe in the Hadoop binaries

Hi Manu,

Keep the path the same ("C:\winutils\bin\winutils.exe"), set HADOOP_HOME as a separate environment variable pointing at C:\winutils (not at the bin folder), and try again.
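A quick way to sanity-check that setup from code before launching the Spark job (the `WinutilsCheck` object name is just illustrative; Hadoop resolves the binary as %HADOOP_HOME%\bin\winutils.exe):

```scala
import java.nio.file.{Files, Paths}

object WinutilsCheck {
  // Builds the path where Hadoop expects winutils.exe under HADOOP_HOME
  def winutilsPath(hadoopHome: String) =
    Paths.get(hadoopHome, "bin", "winutils.exe")

  def main(args: Array[String]): Unit = {
    // HADOOP_HOME must point at C:\winutils itself, not at C:\winutils\bin
    val hadoopHome = sys.env.getOrElse("HADOOP_HOME",
      sys.error("HADOOP_HOME is not set"))
    val exe = winutilsPath(hadoopHome)
    if (Files.exists(exe)) println(s"Found $exe")
    else println(s"Missing $exe -- check HADOOP_HOME")
  }
}
```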

Regards
venkat

Not sure if this is different in Eclipse, but in IntelliJ we added the code below, where the Hadoop-common-2.2.0-bin-master folder contains the downloaded winutils.

def main(args: Array[String]): Unit = {
  // Point hadoop.home.dir at the folder that contains bin\winutils.exe
  // (backslashes must be escaped in a Scala string literal)
  System.setProperty("hadoop.home.dir", "C:\\Hadoop-common-2.2.0-bin-master\\hadoop-common-2.2.0-bin-master")
  // ... rest of the Spark job
}
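One detail worth noting: the System.setProperty call has to run before the SparkContext is created, because Hadoop's Shell class looks up winutils.exe when it is first loaded. A minimal sketch assuming that ordering (the app name and local[*] master are illustrative, not from the original post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkWinutilsDemo {
  def main(args: Array[String]): Unit = {
    // Set hadoop.home.dir first, before any Spark/Hadoop class initializes,
    // otherwise Hadoop has already tried (and failed) to locate winutils.exe
    System.setProperty("hadoop.home.dir",
      "C:\\Hadoop-common-2.2.0-bin-master\\hadoop-common-2.2.0-bin-master")

    val conf = new SparkConf().setAppName("SparkWinutilsDemo").setMaster("local[*]")
    val sc = new SparkContext(conf)
    try {
      // Illustrative work: count elements of a small local RDD
      val n = sc.parallelize(1 to 100).count()
      println(s"count = $n")
    } finally {
      sc.stop()
    }
  }
}
```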

sample ref:
http://teknosrc.com/spark-error-java-io-ioexception-could-not-locate-executable-null-bin-winutils-exe-hadoop-binaries/