PySpark taking a long time to start

Hi,

For me, PySpark is taking an unusually long time to start, at least 15 minutes. Can you please suggest what might be causing this?

I am using the following command to start PySpark:

pyspark2 --master yarn \
  --conf spark.ui.port=26089 \
  --packages com.databricks:spark-avro_2.11:4.0.0 \
  --jars /usr/share/java/mysql-connector-java.jar \
  --driver-class-path /usr/share/java/mysql-connector-java.jar
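
I suspect the --packages option might be involved, since the warnings below reference jars pulled from my local ~/.ivy2 cache (Ivy dependency resolution for spark-avro). Would a stripped-down launch like the following (just a sketch, dropping the Avro package and the MySQL connector jars) be a fair way to check whether dependency resolution is the slow step?

pyspark2 --master yarn --conf spark.ui.port=26089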

Below are some of the messages where it stays stuck for about 15 minutes before the PySpark banner appears:

20/06/03 22:22:31 WARN Client: Same path resource file:///home/ardeepdas/.ivy2/jars/org.xerial.snappy_snappy-java-1.0.5.jar added multiple times to distributed cache.
20/06/03 22:22:31 WARN Client: Same path resource file:///home/ardeepdas/.ivy2/jars/org.apache.commons_commons-compress-1.4.1.jar added multiple times to distributed cache.
20/06/03 22:22:31 WARN Client: Same path resource file:///home/ardeepdas/.ivy2/jars/org.tukaani_xz-1.0.jar added multiple times to distributed cache.

Thanks in advance!