PySpark exception while loading

org.apache.spark.SparkException: Dynamic allocation of executors requires the external shuffle service. You may enable this through spark.shuffle.service.enabled.

I am getting this exception while trying to launch PySpark.

I also tried launching it from Cygwin, but I get the same error.
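The error itself points at the cause: dynamic allocation (`spark.dynamicAllocation.enabled=true`) is set somewhere in your configuration, but the external shuffle service it depends on is not. A sketch of the two usual fixes, assuming your settings live in `conf/spark-defaults.conf` (the exact path depends on your install):

```
# Option 1: keep dynamic allocation and turn on the external shuffle service
spark.dynamicAllocation.enabled   true
spark.shuffle.service.enabled     true

# Option 2: disable dynamic allocation entirely (often simpler for a local setup)
# spark.dynamicAllocation.enabled  false
```

The same properties can be passed on the command line instead, e.g. `pyspark --conf spark.dynamicAllocation.enabled=false`, which is handy for checking which option resolves it before editing the config file.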

@brkanth

I have checked on my side and it is working now. Thank you for your patience.