Unable to start pyspark

pyspark-shell

#1

When I launch pyspark, it keeps printing the “Application report for application …” message over and over and never reaches the shell prompt.


#2

What is the command you are using?


#3

pyspark --master yarn --conf spark.ui.port=12344 --num-executors 2 --executor-memory 1024M --executor-cores 2 --packages com.databricks:spark-avro_2.10:2.0.1
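
Once the shell does come up, a quick way to confirm the --packages flag pulled in spark-avro is to read an Avro file through it. This is a minimal sketch, assuming a Spark 1.x pyspark shell (where sqlContext is predefined) and a hypothetical HDFS path /user/demo/sample.avro:

# Minimal sketch: verify the spark-avro package loaded via --packages.
# Assumes Spark 1.x (sqlContext available); the path below is hypothetical.
df = sqlContext.read.format("com.databricks.spark.avro").load("/user/demo/sample.avro")
df.printSchema()   # confirm the Avro schema was picked up
df.show(5)         # preview a few rows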


#4

I think it is fine now…