Unable to launch pyspark shell

apache-spark

#1

Hi,
I'm facing issues while launching the pyspark shell. Whenever I run the command pyspark --master yarn --conf spark.ui.port=12345, I see the error below:

18/03/01 11:09:10 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:11 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:12 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:13 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:14 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:15 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:16 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:17 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:18 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:19 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
18/03/01 11:09:20 INFO Client: Application report for application_1517228278761_24905 (state: ACCEPTED)
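An application looping in the ACCEPTED state usually means the ResourceManager has accepted the job but no queue capacity is free to start the ApplicationMaster. A few standard YARN CLI commands, run from a gateway node, help confirm that; the queue name "default" below is an assumption, so substitute your cluster's actual queue:

```shell
# List every application currently stuck in ACCEPTED (waiting for AM resources)
yarn application -list -appStates ACCEPTED

# Detailed report for the stuck application from the log above
yarn application -status application_1517228278761_24905

# Show how much of the queue's capacity is already in use
# ("default" is an assumed queue name; check with `yarn queue` on your cluster)
yarn queue -status default
```

If the queue status shows the AM resource limit exhausted while many applications sit in ACCEPTED, the scheduler change discussed in post #2 is the relevant knob.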


#2

I’m facing the same problem now.

@itversity, could someone look into this at the earliest, please? It looks like YARN is not able to handle a lot of users at the same time. I tried the suggestion from https://stackoverflow.com/questions/30828879/application-report-for-application-state-accepted-never-ends-for-spark-submi:

The suggestion is to change the property yarn.scheduler.capacity.maximum-am-resource-percent from 0.1 to 0.5 in the file /etc/hadoop/conf/capacity-scheduler.xml.

But that file is read-only for us, for obvious reasons.
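For reference, here is what that Stack Overflow suggestion looks like as a capacity-scheduler.xml fragment. Only a cluster admin with write access can apply it; raising the value to 0.5 lets ApplicationMasters use up to half of the cluster's resources instead of the default 10%:

```xml
<!-- fragment of /etc/hadoop/conf/capacity-scheduler.xml -->
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <!-- Default is 0.1: only 10% of cluster resources may run ApplicationMasters,
       which caps how many applications can leave ACCEPTED concurrently. -->
  <value>0.5</value>
</property>
```

After an admin edits the file, the change can be applied without a restart by running yarn rmadmin -refreshQueues on the ResourceManager host.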