Pyspark does not support any application options - RB

Hi, I get the error below when I launch pyspark:

[rbyrappa@gw03 ~]$ pyspark --master yarn --config spark.ui.port=12345
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(
at org.apache.spark.launcher.Main.main(

Can you suggest what I should do?
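Note for anyone hitting the same error: a likely cause (an assumption, not verified against this cluster) is that `--config` is not a recognized spark-submit/pyspark flag, so `spark.ui.port=12345` is parsed as an application argument, which the pyspark shell rejects. The supported flag is `--conf key=value`. A minimal sketch of the corrected launch, assuming an HDP-style install where the multiple-versions warning is controlled by `SPARK_MAJOR_VERSION`:

```shell
# Assumption: on HDP, this env var selects which installed Spark is launched
# (set to 1 or 2) and silences the "multiple versions" warning.
export SPARK_MAJOR_VERSION=2

# Use --conf (not --config) to pass Spark properties such as the UI port.
pyspark --master yarn --conf spark.ui.port=12345
```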
