Unable to run pyspark command

I'm trying to run the command below, but I'm receiving the error that follows.

The Spark version is 1.6.2. Is this the right command?

pyspark --master yarn --conf spark.ui.port=12562 --execution-memory 2G --num-executors 1

Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:254)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:117)
at org.apache.spark.launcher.Main.main(Main.java:87)

I think there are two versions of Spark installed, and the default being picked may not be 1.6.
Check for the different versions of pyspark/spark with the following command:

sudo find / -name 'pyspark' -o -name 'spark'

If you find two different versions, invoke the appropriate one by its full path, e.g. /usr/hdp/2.4.2.0-258/spark/python/pyspark.
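
Also note that the launcher error itself ("pyspark does not support any application options") is thrown when spark-submit does not recognize an option and passes it through to the pyspark shell as an application argument. --execution-memory is not a valid spark-submit flag; the standard flag is --executor-memory. A corrected version of the original command (keeping the same port and memory values) would be:

pyspark --master yarn --conf spark.ui.port=12562 --executor-memory 2G --num-executors 1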

[shantanil@gw03 ~]$ sudo find / -name 'pyspark' or 'spark'
[sudo] password for shantanil:
shantanil is not in the sudoers file. This incident will be reported.
[shantanil@gw03 ~]$
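
Since the account is not in the sudoers file, the search can also be run without sudo. On an HDP cluster the installations normally sit under /usr/hdp (assumed here from the /usr/hdp/2.4.2.0-258 path mentioned above), and 2>/dev/null just hides permission-denied noise:

which pyspark
find /usr/hdp -maxdepth 3 -name 'pyspark' 2>/dev/null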

[shantanil@gw03 ~]$ pyspark –master yarn --conf spark.ui.port=12888
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:254)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:241)

[shantanil@gw03 ~]$ export SPARK_MAJOR_VERSION=2
[shantanil@gw03 ~]$ pyspark –master yarn --conf spark.ui.port=12888
SPARK_MAJOR_VERSION is set to 2, using Spark2
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:253)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:290)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:147)
at org.apache.spark.launcher.Main.main(Main.java:87)
[shantanil@gw03 ~]$ export SPARK_MAJOR_VERSION=1
[shantanil@gw03 ~]$ pyspark –master yarn --conf spark.ui.port=12888
SPARK_MAJOR_VERSION is set to 1, using Spark
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:254)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:117)
at org.apache.spark.launcher.Main.main(Main.java:87)
[shantanil@gw03 ~]$ sudo find / -name 'pyspark' or 'spark'
[sudo] password for shantanil:
shantanil is not in the sudoers file. This incident will be reported.
[shantanil@gw03 ~]$
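
One detail worth checking in the pasted session: in each failing command the dash before master is a single en dash (–master) rather than two ASCII hyphens (--master). The launcher would not recognize that as an option and would forward it to the pyspark shell as an application argument, producing exactly this IllegalArgumentException regardless of which SPARK_MAJOR_VERSION is set. Retyping the flags by hand with plain hyphens is worth a try:

export SPARK_MAJOR_VERSION=1
pyspark --master yarn --conf spark.ui.port=12888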