Connection issue while launching Spark Shell

#1

Hi Team,

I am facing an issue while launching spark-shell. Please look into it on priority.

Thanks,
Dhawal

#2

Please look into this on priority. The issue gets worse over the weekend. Please fix this.

#3

@Dhawal_Shah

We are looking into this issue. We will get back to you as soon as possible. The command you are running is:

spark-shell --master yarn --conf spark.ui.port=12356 --num-executors 6 --executor-cores 3 --executor-memory 2G

#4

Hello @Dhawal_Shah, there is no way to troubleshoot the issue from the screenshot you shared. It would be great if you could share the command in a formatted way.

#5

Thank you @Ramesh1 for pulling up the command he is using. @Dhawal_Shah, you will not be able to use more than 2 cores per executor due to the restrictions in YARN.

The command below works without any issues.

spark-shell \
  --master yarn \
  --conf spark.ui.port=12356 \
  --num-executors 6 \
  --executor-cores 2 \
  --executor-memory 2G

The Spark executor size has to fall within the range of the YARN container minimum and maximum sizes configured for the cluster.
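
If you want to verify those limits yourself, they come from the YARN scheduler settings. A minimal sketch for checking them, assuming a standard layout where the configs live under /etc/hadoop/conf (the path may differ on your cluster):

# Print the YARN container limits that cap executor memory and cores.
# The config path is an assumption; adjust it to wherever yarn-site.xml lives.
grep -A1 -E 'yarn.scheduler.(minimum|maximum)-allocation-(mb|vcores)' /etc/hadoop/conf/yarn-site.xml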

#6

Thank you @itversity and @Ramesh1 for looking into it. I am able to launch spark-shell with 2 cores.

Just curious, was this setting changed recently? I once launched spark-shell with 3 cores.

Thanks,
Dhawal

#7

I don’t think we have changed anything recently. You might have launched it without YARN.
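
For example, in local mode the YARN vcore limit does not apply, so a 3-core launch would have worked. A minimal sketch (keeping your UI port only for illustration):

# local[3] runs Spark in a single JVM with 3 worker threads; YARN limits do not apply.
spark-shell --master local[3] --conf spark.ui.port=12356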

closed #8