Not able to connect to Pyspark


I am not able to start the PySpark shell - I ran the command below and got the exception shown.

Thanks for the help!

[premkilaru@gw02 ~]$ export SPARK_MAJOR_VERSION=2
[premkilaru@gw02 ~]$ pyspark -master yarn -conf spark.ui.port=12888
SPARK_MAJOR_VERSION is set to 2, using Spark2
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(
at org.apache.spark.launcher.Main.main(




Use double hyphens instead of single hyphens. Spark's launcher only recognizes long-form options such as --master and --conf; a single-hyphen token like -master is not a recognized flag, so it and everything after it get passed through as application options, and the pyspark shell does not accept any application options - hence the IllegalArgumentException.

pyspark --master yarn --conf spark.ui.port=12888
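To see why the single-hyphen form fails, here is a minimal sketch (an assumption, a simplified model of Spark's launcher, not its actual code): the launcher walks the argument list, consuming tokens it recognizes as flags, and treats the first unrecognized token and everything after it as application options.

```python
# Simplified model of how Spark's launcher splits command-line arguments.
# KNOWN_FLAGS is a small illustrative subset, not the full option list.
KNOWN_FLAGS = {"--master", "--conf", "--deploy-mode"}

def split_args(argv):
    """Split argv into (launcher options, application options)."""
    opts, i = [], 0
    while i < len(argv):
        tok = argv[i]
        if tok in KNOWN_FLAGS:
            opts.append((tok, argv[i + 1]))  # consume flag and its value
            i += 2
        else:
            # First unrecognized token: it and the rest become app options.
            return opts, argv[i:]
    return opts, []

# "-master" is not a known flag, so everything becomes application
# options - and pyspark rejects any application options.
print(split_args(["-master", "yarn", "-conf", "spark.ui.port=12888"]))

# With double hyphens, both flags parse as launcher options and the
# application-option list stays empty.
print(split_args(["--master", "yarn", "--conf", "spark.ui.port=12888"]))
```

Under this model, the single-hyphen command leaves the entire argument list as application options, which is exactly the condition the exception message reports.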