Starting pyspark

pyspark
python

Hello Everybody,

I'm having a problem starting pyspark. When I looked into the .bashrc file, the Python-related variables are set as below. Is this correct, or should I point them at the Python interpreter itself?
I'm using Python 2.6.6.
Please help me with this doubt.

export SPARK_CLASSPATH=/usr/share/java/postgresql-jdbc-8.4.704.jar
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
export PYSPARK_DRIVER_PYTHON=jupyter
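For context, the two PYSPARK_DRIVER_PYTHON variables above only control which program launches the driver (here, Jupyter in notebook mode); Spark reads a separate variable, PYSPARK_PYTHON, to choose the interpreter used on the workers. Below is a minimal sketch of what the relevant .bashrc lines might look like if you also want to pin the interpreter explicitly. The /usr/bin/python2.6 path is an assumption and would need to match the actual installation.

# JDBC driver to add to the Spark classpath
export SPARK_CLASSPATH=/usr/share/java/postgresql-jdbc-8.4.704.jar
# Interpreter used by the Spark workers (path is an assumption, adjust to your install)
export PYSPARK_PYTHON=/usr/bin/python2.6
# Launch the driver through Jupyter in notebook mode
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"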
