Error - While launching pyspark shell


#1

Hi,

I was running pyspark, following the tutorial video.

I used the command below:

pyspark --master yarn
--conf spark.ui.port=12569
--num-executors 2
--executor-memory 512 mb

The error I received:

[dasjeevan@gw03 ~]$ pyspark --master yarn \

--conf spark.ui.port=12569
--num-executors 2
--executor-memory 512 mb
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
Exception in thread "main" java.lang.IllegalArgumentException: pyspark does not support any application options.
at org.apache.spark.launcher.CommandBuilderUtils.checkArgument(CommandBuilderUtils.java:254)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildPySparkShellCommand(SparkSubmitCommandBuilder.java:241)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:117)
at org.apache.spark.launcher.Main.main(Main.java:87)
[dasjeevan@gw03 ~]$

Could you please guide me on the following:

For Spark 1.6
==========

   - How to launch pyspark on the Spark 1.6 cluster?
   - How to submit a script to the Spark 1.6 cluster with spark-submit?

For Spark 2
==========

   - How to launch pyspark on the Spark 2 cluster?
   - How to submit a script to the Spark 2 cluster with spark-submit?

Also, I found that the RM (ResourceManager) UI shows the cluster has 5 nodes, but the signature message below says 13 nodes. Just trying to understand why that is.

Looking forward to your response.

Warm Regards,
Jeevan Das
Cell : 9886618820


Learn Spark 1.6.x or Spark 2.x on our state of the art big data labs

  • Click here for access to state of the art 13 node Hadoop and Spark Cluster


#2

@Jeevan_Das In the labs, the default Spark version is 1.6.
To launch pyspark, use the command below (on a single line):

pyspark --master yarn --conf spark.ui.port=12569 --num-executors 2 --executor-memory 512mb
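If you prefer splitting the command across lines, as in the original post, each continued line needs a trailing backslash. Without it, the shell treats the remaining flags as arguments to pyspark itself, which is what produces the "pyspark does not support any application options" error above. A sketch of the same command with continuations:

```shell
pyspark --master yarn \
  --conf spark.ui.port=12569 \
  --num-executors 2 \
  --executor-memory 512mb
```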

To launch Spark 2, run the commands below:

export SPARK_MAJOR_VERSION=2
pyspark --master yarn --conf spark.ui.port=12569 --num-executors 2 --executor-memory 512mb

Note:
Don't put a space between 512 and mb.
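The question also asked about spark-submit. A minimal sketch, using the same flags as the pyspark commands above; the script name `my_script.py` is an illustrative assumption:

```shell
# Spark 1.6 (the default when SPARK_MAJOR_VERSION is unset):
spark-submit --master yarn \
  --conf spark.ui.port=12569 \
  --num-executors 2 \
  --executor-memory 512m \
  my_script.py  # my_script.py is a placeholder for your own script

# Spark 2: export SPARK_MAJOR_VERSION first, then submit the same way:
export SPARK_MAJOR_VERSION=2
spark-submit --master yarn \
  --conf spark.ui.port=12569 \
  --num-executors 2 \
  --executor-memory 512m \
  my_script.py
```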


#3

Thank you, Annapurna.

But I think Spark 2 was also upgraded recently. How do I access Spark 2?

Regards,
Jeevan


#4

Use the `export SPARK_MAJOR_VERSION=2` command above to launch Spark 2.


#5

I apologize. I missed seeing it earlier.

