Not able to run Spark from PyCharm


I set up Spark 1.6.3 and Python 2.6 and configured PyCharm to run Python code for Spark. I can run the same code from the command line, but not from PyCharm. Here are the errors:

SparkContext._gateway = gateway or launch_gateway()
SPARK_HOME = os.environ["SPARK_HOME"]
KeyError: 'SPARK_HOME'



  1. Are you running PyCharm in the same OS where you have installed Spark and Python?
  2. Is it installed in a VM or directly on the OS?
  3. Can you post a screenshot?


I have the same error as you, vtalapaneni - did you manage to solve it?

sc = SparkContext(master="local", appName="Spark Demo")
SparkContext._ensure_initialized(self, gateway=gateway)
SparkContext._gateway = gateway or launch_gateway()
SPARK_HOME = os.environ["SPARK_HOME"]
KeyError: 'SPARK_HOME'

In .bashrc I have SPARK_HOME defined:

export SPARK_HOME=/Users/mhasse/Documents/spark-1.6.3-bin-hadoop2.6
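If PyCharm still does not see the variable, one workaround (a sketch; the path is the one from the export above, adjust it for your machine) is to set it in the script itself before anything from pyspark is imported:

```python
import os

# PyCharm does not read .bashrc, so set SPARK_HOME in-process
# before any pyspark import (path copied from the export above;
# adjust it for your machine).
os.environ.setdefault(
    "SPARK_HOME",
    "/Users/mhasse/Documents/spark-1.6.3-bin-hadoop2.6",
)

# launch_gateway() reads os.environ["SPARK_HOME"], so the KeyError
# disappears once the variable exists in the process environment.
print("SPARK_HOME" in os.environ)
```

`setdefault` keeps any value that is already set (e.g. from a run configuration), so the hard-coded path only acts as a fallback.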


Hello,
I have the same issue.


I tried doing it like this (I set the environment variable "SPARK_HOME" in Edit Configurations).


This issue occurs because the environment variables are not loaded properly. After setting them up, you need to reopen PyCharm so that all environment variables get loaded; that will solve the problem.
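Until PyCharm has been restarted, you can check whether the variable is actually visible to the interpreter it launches; a minimal sketch (the helper name is my own, not from pyspark):

```python
import os

def check_spark_home(env=os.environ):
    """Return SPARK_HOME if set, else a hint on how to fix it (a sketch)."""
    spark_home = env.get("SPARK_HOME")
    if spark_home is None:
        return "SPARK_HOME is not set; restart PyCharm after exporting it"
    return spark_home

print(check_spark_home({"SPARK_HOME": "/opt/spark"}))  # prints /opt/spark
```

Running this in the PyCharm run configuration (with the real `os.environ`) tells you immediately whether the `KeyError` will occur, without waiting for the Spark gateway to fail.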


Similarly, on Windows you will also need to set up the winutils configuration for Hadoop.
Thanks.
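The winutils setup mentioned above can be handled the same way; a sketch (the `C:\hadoop` path is an assumption, point it at whatever folder contains `bin\winutils.exe` on your machine):

```python
import os

# On Windows, Spark expects HADOOP_HOME to point at a folder that
# contains bin\winutils.exe. The path below is an assumption;
# replace it with your own winutils location.
os.environ.setdefault("HADOOP_HOME", r"C:\hadoop")
print("HADOOP_HOME" in os.environ)
```

As with SPARK_HOME, this must run before the Spark context is created, and a restart of PyCharm is needed if you set the variable at the OS level instead.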


Try File -> Settings -> Project Interpreter and choose the proper value from the interpreter dropdown. This resolves the issue.