How do I use a recent Python, say 3.6, with PySpark version 2+ in the lab?

I used `export SPARK_MAJOR_VERSION=2` for PySpark, but Python is still 2.7.
I want to use Python 3+ with PySpark.


You have to use:

export PYSPARK_PYTHON=python3.6

Otherwise it will launch Python 2.7, since that is the default version.
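Putting the two settings together, a minimal session might look like the sketch below. The binary name `python3.6` is an assumption based on this lab; adjust it if your Python 3 interpreter is installed under a different name or path.

```shell
# Point PySpark at the Python 3 interpreter (assumed binary name: python3.6;
# the value is case-sensitive and must match an executable on your PATH)
export PYSPARK_PYTHON=python3.6

# Select the Spark 2.x install when multiple Spark versions are present
export SPARK_MAJOR_VERSION=2

# Sanity-check the variables before launching pyspark
echo "PYSPARK_PYTHON=$PYSPARK_PYTHON SPARK_MAJOR_VERSION=$SPARK_MAJOR_VERSION"
```

After these exports, running `pyspark` in the same shell should start a Spark 2.x session with the Python 3.6 interpreter.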

Hi, thanks for the quick response. This is what I am getting:

[diamatic@gw02 ~]$ export PYSPARK_PYTHON=PYTHON3
[diamatic@gw02 ~]$ export MAJOR_SPARK_VERSION=2
[diamatic@gw02 ~]$ pyspark
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
env: PYTHON3: No such file or directory
[diamatic@gw02 ~]$


Sorry, my mistake: you have python3.6 installed, not python3. Also note the value is case-sensitive, so `PYTHON3` will not resolve to an executable (hence the "env: PYTHON3: No such file or directory" error), and the variable name is SPARK_MAJOR_VERSION, not MAJOR_SPARK_VERSION.

This worked together with `export SPARK_MAJOR_VERSION=2`.