While trying to run spark-shell with "yarn" as the master, I am getting the error below:

Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.
However, when I run spark-shell normally (without specifying yarn as the master), it works fine.
I have searched the internet, and the common suggestion is to add a line to "spark-env.sh" setting HADOOP_CONF_DIR, as the error message mentions.
Please tell me where I can find the spark-env.sh script. If I need to create one myself, where should it be stored?
If there is any other way to fix this, please let me know that as well.

Eagerly awaiting your response.
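For context, here is roughly what I gathered from my searching. This is a sketch based on my understanding, not something I have confirmed works: spark-env.sh apparently does not exist by default and is created from a template in Spark's conf/ directory, and the Hadoop config path (/etc/hadoop/conf) is an assumption about a typical cluster layout:

```shell
# spark-env.sh is not shipped by default; it is created by copying the
# template in Spark's conf/ directory (SPARK_HOME is assumed to point
# at the Spark installation):
cp "$SPARK_HOME/conf/spark-env.sh.template" "$SPARK_HOME/conf/spark-env.sh"

# Then append the export the guides mention, pointing at the directory
# that contains yarn-site.xml and core-site.xml (/etc/hadoop/conf is an
# assumption about my cluster's layout):
echo 'export HADOOP_CONF_DIR=/etc/hadoop/conf' >> "$SPARK_HOME/conf/spark-env.sh"
```

Is this the right approach, and is conf/ under the Spark installation the correct location for the file?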
Thanks and Regards,