Unable to launch pyspark on specific yarn port

$pyspark --master yarn --conf spark.ui.port= 128888

Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:254)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:117)
at org.apache.spark.launcher.Main.main(Main.java:87)

Note:
I already referred to a previous issue raised here:

It doesn't seem to be an issue caused by the two installed versions; please check the exception raised. I also tried using another port, and it didn't work either.

@Akshay_Jain_Dhotia There should not be any space between = and the port number.

Try the command below and let us know:

pyspark --master yarn --conf spark.ui.port=128888

Thanks, that solved the issue after I also changed the port number to 5 digits.
:slight_smile:
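For anyone who finds this thread later: the 5-digit change matters because valid TCP port numbers only go up to 65535, so 128888 can never be bound; any free in-range port works (Spark's UI defaults to 4040). A minimal sketch of a pre-flight check, using a hypothetical `is_valid_port` helper, could look like this:

```shell
# Valid TCP ports run from 0 to 65535; unprivileged processes should
# stay at 1024 or above. 128888 fails this check, which is why a
# 5-digit port worked.
is_valid_port() {
  [ "$1" -ge 1024 ] && [ "$1" -le 65535 ]
}

if is_valid_port 12888; then
  echo "12888 is usable"
fi

if ! is_valid_port 128888; then
  echo "128888 is out of range"
fi
```

With a validated port, the corrected command (no space after =) would then be `pyspark --master yarn --conf spark.ui.port=12888`.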