PySpark not running

pyspark

#1

Hi,

I installed Python and PySpark using the videos from the CCA-175 YouTube playlist below.

Following that playlist, I have installed the JDK, JRE, and Python, and set up spark-2.2.0-bin-hadoop2.6. The YouTube video said to use spark-1.6.3-bin-hadoop2.6, but I thought I would use a higher version.

Can you tell me how to fix the error shown in the screenshot below? I am not able to see Spark running.
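For reference, something like the minimal session below is what I would expect to run once PySpark starts correctly (just a sketch, assuming spark-2.2.0 and that pyspark is importable from the shell):

```python
# Minimal PySpark sanity check for Spark 2.x.
from pyspark.sql import SparkSession

# Start a local session; local[*] uses all available cores.
spark = SparkSession.builder \
    .master("local[*]") \
    .appName("sanity-check") \
    .getOrCreate()

print(spark.version)   # expect 2.2.0
spark.range(5).show()  # expect a one-column DataFrame with 0..4
spark.stop()
```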




#2

I have all the path variables set as below:

HADOOP_HOME = C:\hadoop
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_18
SPARK_HOME = C:\spark-2.2.0-bin-hadoop2.6
The Path variable is also set as shown in the screenshots below; a quick check of these variables is sketched after this list.
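To rule out the variables themselves, a small script like this (a sketch; the paths are the ones above) can confirm what Python actually sees, including winutils.exe, which Spark needs under %HADOOP_HOME%\bin on Windows:

```python
# Check that the environment variables PySpark relies on are visible to Python.
import os

for name in ("HADOOP_HOME", "JAVA_HOME", "SPARK_HOME"):
    print(name, "=", os.environ.get(name, "<not set>"))

# On Windows, Spark needs winutils.exe in %HADOOP_HOME%\bin;
# a missing winutils.exe is a common cause of startup errors.
winutils = os.path.join(os.environ.get("HADOOP_HOME", ""), "bin", "winutils.exe")
print("winutils.exe present:", os.path.exists(winutils))
```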

Let me know whether I should downgrade the Spark version, or whether I can solve this some other way.