How to open Spark 2 in Python

pyspark

#1

Hi Team,
The pyspark2 command is not working. How do I open Spark 2 in Python, or launch pyspark2?

The command below takes me to the Scala shell instead. Please help.

export SPARK_MAJOR_VERSION=2
spark-shell \
  --master yarn \
  --deploy-mode client \
  --conf spark.ui.port=12335 \
  --num-executors 2 \
  --executor-memory 512M

Thanks,
Amit


#2

@Amit1 To launch pyspark2, run the commands below:

export SPARK_MAJOR_VERSION=2

pyspark --master yarn --conf spark.ui.port=12569
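Once the shell comes up, a quick way to confirm you actually got Spark 2 (a suggestion on my part, not from the thread) is to inspect sc.version, which the pyspark shell predefines. A minimal sketch of the check, using a placeholder string since sc only exists inside the shell:

```python
# In the pyspark shell, sc.version returns a version string.
# "2.3.0" below is a placeholder for whatever your cluster reports.
version = "2.3.0"  # in the shell: version = sc.version

# The major version is the part before the first dot;
# it should be "2" after exporting SPARK_MAJOR_VERSION=2.
major = version.split(".")[0]
print("Spark major version:", major)
```

If this prints 1 instead, the SPARK_MAJOR_VERSION export did not take effect in the shell session you launched pyspark from.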


#3

Thanks Annapurna!

Could you please tell me how to set python 3 by default?


#4

@Amit1 To make Python 3 the default, open your .bashrc file.

Add the line alias python=python3.6

Save the file and reload it with . .bashrc

Launch python
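One caveat (my addition, not from the replies above): a shell alias only affects interactive sessions, so pyspark itself may still pick up Python 2. To point pyspark at Python 3 directly, the PYSPARK_PYTHON environment variable is the usual switch; a sketch for .bashrc, assuming python3.6 is on the PATH:

```shell
# Tell pyspark to use Python 3.6 for the driver and executors
# (assumes python3.6 is installed and on the PATH)
export PYSPARK_PYTHON=python3.6
```

After adding this, reload with . .bashrc and relaunch pyspark.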