Spark-submit failing with Exception from container-launch

apache-spark

#1

While submitting the command below, the job is failing.

spark-submit --class wc
–master yarn
–conf spark.ui.port=12360
–num-executors 18
–executor-cores 3
–executor-memory 3584
wc_2.10-1.0.jar /public/randomtextwriter /user/nikkhiel123/wc prod

The error below is coming:

17/12/26 07:42:49 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Container marked as failed: container_e18_1507687444776_30194_01_000004 on host: wn05.itversity.com. Exit status: 1. Diagnostics: Exception from container-launch.
Container id: container_e18_1507687444776_30194_01_000004
Exit code: 1


#2

@N_Chakote:

If you copy your command to Notepad and check, you will see that you gave a single -.

Possible points to check:

  1. You should give -- (double hyphen), not a single dash.
  2. Use a line break (\) at the end of each line, the Linux way (see the example after this list).
  3. --jars (if you are passing jar files). [I guess you are passing the jar file as an argument, so this is not relevant here.]
  4. Do you think your data set needs this much configuration?
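
For reference, here is a minimal sketch of your command with double hyphens and Linux line continuations (I also added an explicit M suffix to the memory value, assuming megabytes were intended; adjust if not):

# same options as in your post, only the syntax is corrected
spark-submit --class wc \
  --master yarn \
  --conf spark.ui.port=12360 \
  --num-executors 18 \
  --executor-cores 3 \
  --executor-memory 3584M \
  wc_2.10-1.0.jar /public/randomtextwriter /user/nikkhiel123/wc prod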

Can you please also provide the video/task link you are referring to, so it will be quicker to understand your issue? Please post an update if you are able to solve it.
Thanks
Venkat


#3

Thanks, Venkat.

I did give double hyphens (--) and Linux line breaks (\), but they are not visible here after copying the command (I don't know why).
I am trying to go through the performance tuning video from Durga.
I will try again tomorrow and let you know the status.


#4

@N_Chakote:

I guess you need to review your configuration. With these settings you are consuming a total of 189GB of YARN/cluster memory, but our lab has only 120GB in total (see http://rm01.itversity.com:8088/cluster).
Please reduce your config and try the settings recommended below. If the issue still persists, please keep us posted with updates.

####################################
spark-submit --class wc
--master yarn
--conf spark.ui.port=12360
--num-executors 6
--executor-cores 2
--executor-memory 1G
####################################
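
Putting that together with the jar and arguments from your first post, the full command would look roughly like this (a sketch; adjust paths and values as needed):

spark-submit --class wc \
  --master yarn \
  --conf spark.ui.port=12360 \
  --num-executors 6 \
  --executor-cores 2 \
  --executor-memory 1G \
  wc_2.10-1.0.jar /public/randomtextwriter /user/nikkhiel123/wc prod

With 6 executors at 1GB each plus the default YARN overhead, this stays well within the lab's total memory. If the containers still fail, you can pull the full container logs with yarn logs -applicationId <application id>.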

Thanks
Venkat