PySpark does not come up when launched with the following command:


[prashantpr@gw03 ~]$ pyspark --master yarn -conf spark.ui.port=12889
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(
at org.apache.spark.launcher.Main.main(


The error occurs because `-conf` (single dash) is not a recognized launcher flag, so `spark.ui.port=12889` is passed through as an application argument, and the pyspark shell does not accept application options. Use `--conf` (double dash) instead:

pyspark --master yarn --conf spark.ui.port=12889
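Since the warning above also shows that multiple Spark versions are installed, you can pin the major version before launching. A sketch, assuming an HDP-style install where the SPARK_MAJOR_VERSION environment variable selects which Spark is used:

```shell
# Pin Spark 2 so the launcher does not fall back to Spark 1 by default
# (SPARK_MAJOR_VERSION is honored on HDP-style installs; adjust for your distro)
export SPARK_MAJOR_VERSION=2

# Note the double dash on --conf: a single-dash -conf is not recognized
# by the launcher, so the argument after it would be treated as an
# application option, which the pyspark shell rejects.
pyspark --master yarn --conf spark.ui.port=12889
```

Setting SPARK_MAJOR_VERSION also silences the "Multiple versions of Spark are installed" warning.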