How do I use a newer Python, say 3.6, with PySpark 2+ on the lab?

pyspark-shell
python3
#1

I used export SPARK_MAJOR_VERSION=2 for PySpark, but Python is still 2.7.
I want to use Python 3+ with PySpark.


#2

@Diamatic_Kojo

you have to use

export PYSPARK_PYTHON=PYTHON3.6

before

export MAJOR_SPARK_VERSION=2

otherwise it will launch Python 2.7, which is the default version.
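Before launching pyspark it is worth confirming that whatever name you put in PYSPARK_PYTHON actually resolves to a binary on the PATH. A minimal sketch, assuming python3 is installed (substitute python3.6 if that is the installed name on your host):

```shell
# PYSPARK_PYTHON must be a real binary name; this prints the resolved
# path and version, or fails the same way pyspark would if it is missing.
export PYSPARK_PYTHON=python3   # assumption: plain python3 exists here
command -v "$PYSPARK_PYTHON"
"$PYSPARK_PYTHON" -V
```

If command -v prints nothing, pyspark will die with an "env: ... No such file or directory" error before the shell even starts.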


#3

Hi, thanks for the quick response. This is what I am getting:

[diamatic@gw02 ~]$ export PYSPARK_PYTHON=PYTHON3
[diamatic@gw02 ~]$ export MAJOR_SPARK_VERSION=2
[diamatic@gw02 ~]$ pyspark
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
env: PYTHON3: No such file or directory
[diamatic@gw02 ~]$


#4

@Diamatic_Kojo

Sorry, my mistake: the value has to be the actual binary name, lowercase python3.6 (you have Python 3.6 installed, so the binary is python3.6, not python3, and Linux binary names are case-sensitive, which is why PYTHON3 was not found). Note also that the variable is SPARK_MAJOR_VERSION, not MAJOR_SPARK_VERSION, which is why the "Spark1 will be picked by default" warning appeared.
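Putting the two fixes together, a corrected session would look like this (python3.6 is assumed to be the binary name on the lab host; "command -v python3.6 python3" will show what is actually on the PATH):

```shell
# Point PySpark at the Python 3.6 interpreter (lowercase binary name).
export PYSPARK_PYTHON=python3.6
# Pick Spark 2 when multiple Spark versions are installed (note the
# spelling: SPARK_MAJOR_VERSION, not MAJOR_SPARK_VERSION).
export SPARK_MAJOR_VERSION=2
```

Then run pyspark as usual; the shell banner should now report Python 3.6.x instead of 2.7.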
