Cannot run spark-shell with master yarn

apache-spark
#1

Hi,

While trying to run spark-shell with “yarn” as master, I am getting the below error:

Exception in thread "main" java.lang.Exception: When running with master 'yarn' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.

But when I run spark-shell normally (without specifying a master), it works fine.
I have searched the internet, and most answers suggest adding the following line to "spark-env.sh":
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

Please tell me where I can find the spark-env.sh script, or whether I need to create one, and if so, where it should be stored.
If there is any other way to fix this, please let me know as well.
Eagerly awaiting a response.
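For reference, spark-env.sh is not created by default; Spark ships a template for it in the conf directory of the Spark installation. A minimal sketch of creating it and setting HADOOP_CONF_DIR, assuming SPARK_HOME and HADOOP_HOME point at your Spark and Hadoop installations (adjust the paths to your setup):

```shell
# spark-env.sh lives in $SPARK_HOME/conf, but only a template exists by default.
cd "${SPARK_HOME:?SPARK_HOME is not set}/conf"
cp spark-env.sh.template spark-env.sh

# Point Spark at the Hadoop client configs so "--master yarn" can find YARN.
echo 'export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop' >> spark-env.sh
```

Alternatively, exporting HADOOP_CONF_DIR in the shell before launching spark-shell (e.g. in ~/.bashrc) has the same effect; spark-env.sh just makes it apply to every Spark invocation. A plain spark-shell does not hit this error because it defaults to local mode, which never needs the YARN configs.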

Thanks and Regards,
Sabyasachi



#2

@Sabyasachi_Some can you share the command you're trying?


#3

@annapurna

Sure, I am using the below command to start spark-shell:

spark-shell --master yarn

but a normal spark-shell (without --master yarn) doesn't give any error.

Thanks
