Spark-shell issue in labs

apache-spark

#1

spark-shell not connecting
I get this message printed in a loop:

18/05/23 16:51:45 INFO Client: Application report for application_1525279861629_26435 (state: ACCEPTED)





#2

Hi @jayshawusa

Can you share a screenshot so we can better understand and resolve your issue?

Thanks & Regards,
Sravan Kumar


#3

I am also having this same issue! It is intermittent, so a screenshot would not show much. I try to launch the spark-shell using:

spark-shell --master yarn \
--conf spark.ui.port=19191 \
--num-executors 1 \
--executor-memory 512M

Once it starts up, it begins printing

INFO Client: Application report for application_1528528985359_0491 (state: ACCEPTED)

over and over, and it never stops. The only way to stop it is Ctrl-C, which kills the spark-shell. It doesn't matter which port number is used; I have tried several different port numbers. This basically makes the bigdata-labs unusable for my studies if I can't start the shell. Please explain how to work around this issue!


#4

FYI, when this issue happens, you need to go to Ambari and check whether YARN is 100% occupied. In my case the issue was resolved by killing some pending jobs. Since there were no free resources to allocate at that time, your application was simply waiting for its turn, which is why it kept printing:

INFO Client: Application report for application_1528528985359_0491 (state: ACCEPTED)
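If you have shell access to a gateway node (and permission to kill your own applications), a possible way to check and free up YARN capacity from the command line is sketched below. The ResourceManager host is a placeholder you would need to replace with your cluster's address; the application ID shown is the one from this thread.

```shell
#!/bin/sh
# List applications still queued for resources (ACCEPTED = waiting for containers).
yarn application -list -appStates ACCEPTED

# Inspect overall cluster utilisation via the ResourceManager REST API.
# <resourcemanager-host> is a placeholder for your cluster's RM address.
curl -s "http://<resourcemanager-host>:8088/ws/v1/cluster/metrics"

# Kill one of your own stuck applications to release its containers.
yarn application -kill application_1528528985359_0491
```

Note that on a shared lab cluster you should only kill applications you own; killing other users' jobs may need admin rights anyway.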


#5

I am getting a similar error:

18/06/10 08:11:26 WARN YarnScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
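That warning usually means no NodeManager currently has enough free memory or vcores to satisfy the executor request. One way to verify that workers are registered and see their capacity, assuming shell access to the cluster, is the `yarn node` CLI; `<node-id>` below is a placeholder for an ID taken from the list output.

```shell
#!/bin/sh
# Show all registered NodeManagers and their current state (RUNNING, LOST, etc.).
yarn node -list -all

# Show detailed resource usage for one node (memory/vcores used vs. capacity).
# <node-id> is a placeholder; copy an ID from the -list output above.
yarn node -status <node-id>
```

If the nodes are RUNNING but fully occupied, the fix is the same as in post #4: wait for, or kill, the pending jobs that are holding the resources.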