Problem in setting up pycharm with spark and python

pyspark

#1

Hi,
I am following a Udemy course for certification with Python.
I am getting the following error while setting up my development environment for PySpark in PyCharm:
C:\Users\pmudgal\PycharmProjects\getting_started\venv\Scripts\python.exe C:/Users/pmudgal/PycharmProjects/getting_started/SparkDemo.py
'java' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
  File "C:/Users/pmudgal/PycharmProjects/getting_started/SparkDemo.py", line 5, in <module>
    sc = SparkContext("Spark Demo", "local")
  File "C:\_tools\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\_tools\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\_tools\spark-1.6.3-bin-hadoop2.6\python\pyspark\java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

My JAVA_HOME and SPARK_HOME are already set. My PATH contains:
C:\_tools\Python27;C:\_tools\Java\jdk1.8.0_151\bin;C:\_tools\spark-1.6.3-bin-hadoop2.6\bin;C:\_tools\hadoop\bin

Any suggestions?


#2

Can you check this? And what is the path where Spark is installed?
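The "Java gateway process exited before sending the driver its port number" error usually means the `java` executable is not visible to the Python interpreter that PyCharm launches, even if it works in a regular terminal. A minimal check along these lines can confirm what that interpreter actually sees (the `check_spark_env` helper is just an illustrative sketch, not part of PySpark):

```python
import os
import shutil

def check_spark_env():
    """Report whether the pieces Spark's launcher needs are visible
    to this interpreter's environment."""
    return {
        # Spark's java_gateway reads these environment variables.
        "JAVA_HOME": os.environ.get("JAVA_HOME"),
        "SPARK_HOME": os.environ.get("SPARK_HOME"),
        # shutil.which resolves 'java' the same way the OS does from PATH;
        # None here reproduces the "'java' is not recognized" failure.
        "java_on_path": shutil.which("java"),
    }

print(check_spark_env())
```

If `java_on_path` comes back `None` inside PyCharm even though the variables look right in a terminal, restarting PyCharm (or setting the variables in the run configuration) often helps, since the IDE inherits the environment it was started with.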