PySpark exception while loading

#1

org.apache.spark.SparkException: Dynamic allocation of executors requires the external shuffle service. You may enable this through spark.shuffle.service.enabled.

I am getting this exception while trying to launch pyspark.
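This error is raised when dynamic allocation is turned on (spark.dynamicAllocation.enabled=true) but the external shuffle service is not, so Spark refuses to start. One way past it is to disable dynamic allocation; the other is to enable the shuffle service on the worker nodes. A minimal sketch of the first option, assuming you are creating the session yourself (the app name here is a placeholder):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("example")  # hypothetical app name
    # Option 1: turn dynamic allocation off so the external
    # shuffle service is no longer required.
    .config("spark.dynamicAllocation.enabled", "false")
    # Option 2 (instead of the line above): keep dynamic allocation
    # and enable the shuffle service -- note this only works if the
    # external shuffle service daemon is actually running on each
    # worker node (e.g. as a YARN auxiliary service).
    # .config("spark.shuffle.service.enabled", "true")
    .getOrCreate()
)
```

The same setting can also be passed on the command line when launching the shell, e.g. `pyspark --conf spark.dynamicAllocation.enabled=false`, or set once in spark-defaults.conf.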


#2

Tried with Cygwin, but I am getting the same error.


#3

@brkanth

I have checked from my side; it's working now. Thank you for your patience.
