$pyspark --master yarn --conf spark.ui.port= 128888
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
I have already reviewed the previously raised issue on this topic:
This does not appear to be caused by having two Spark versions installed; please look into the exception itself. I also tried a different port, but that did not work either.
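A likely cause, based on the command shown above, is the space after `=` in `--conf spark.ui.port= 128888`: the value `128888` is then parsed as a standalone application argument, which `pyspark` rejects with exactly this `IllegalArgumentException`. Note also that 128888 is outside the valid TCP port range (1–65535), so even with the space removed the port value would need to change. A sketch of the corrected invocation (the port number here is an arbitrary example):

```shell
# Remove the space after '=' so the value binds to the spark.ui.port key
# instead of being parsed as an application argument, and pick a port
# within the valid TCP range (1-65535):
pyspark --master yarn --conf spark.ui.port=12888
```

If the "Multiple versions of Spark are installed" warning is unwanted, setting `SPARK_MAJOR_VERSION` (e.g. `export SPARK_MAJOR_VERSION=2`) before launching selects a version explicitly, though that warning is separate from the exception above.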