Not able to run Spark from PyCharm


#1

I set up Spark 1.6.3 and Python 2.6 and configured PyCharm to run Python code for Spark. I can run the same code from the command line, but not from PyCharm. Here is the error:

SparkContext._gateway = gateway or launch_gateway()
SPARK_HOME = os.environ["SPARK_HOME"]
return self.data[key.upper()]
KeyError: 'SPARK_HOME'
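The traceback shows that pyspark's launch_gateway() reads SPARK_HOME straight from os.environ, so the KeyError just means the variable is missing from the environment PyCharm gave the interpreter. A minimal workaround sketch (the install path is an example, use your own, and note that pyspark must be imported only after the variable is set):

```python
import os

# launch_gateway() does roughly os.environ["SPARK_HOME"], which raises
# KeyError when the variable is absent from PyCharm's environment.
# Set it in the script itself, before pyspark is ever imported.
# (Example path -- replace with your own Spark install directory.)
os.environ["SPARK_HOME"] = "/Users/mhasse/Documents/spark-1.6.3-bin-hadoop2.6"

# Only now import pyspark, so launch_gateway() can find the variable:
#   from pyspark import SparkContext
#   sc = SparkContext(master="local", appName="Spark Demo")
```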


#2

@vtalapaneni

  1. Are you running PyCharm in the same OS where you installed Spark and Python?
  2. Is Spark installed in a VM or directly on the OS?
  3. Can you post a screenshot?

#3

I have the same error as you, vtalapaneni. Did you manage to solve it?

sc = SparkContext(master="local", appName="Spark Demo")
SparkContext._ensure_initialized(self, gateway=gateway)
SparkContext._gateway = gateway or launch_gateway()
SPARK_HOME = os.environ["SPARK_HOME"]
KeyError: 'SPARK_HOME'

In ~/.bashrc I have SPARK_HOME defined:

export SPARK_HOME=/Users/mhasse/Documents/spark-1.6.3-bin-hadoop2.6
export PATH=$PATH:$SPARK_HOME/bin
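Exports in ~/.bashrc are only read by login shells, so a PyCharm instance launched from the desktop never sees them. A sketch that mirrors those exports inside the script instead (the py4j zip name below matches what Spark 1.6.x typically bundles, but that is an assumption; check the actual file name in $SPARK_HOME/python/lib):

```python
import os
import sys

# Mirror the ~/.bashrc exports inside the script, so the run no longer
# depends on how PyCharm was started.
spark_home = "/Users/mhasse/Documents/spark-1.6.3-bin-hadoop2.6"
os.environ["SPARK_HOME"] = spark_home

# pyspark is usually not on PYTHONPATH either, so add Spark's python
# directory and its bundled py4j zip manually before importing pyspark.
# (py4j-0.9-src.zip is an assumption -- verify in $SPARK_HOME/python/lib.)
sys.path.insert(0, os.path.join(spark_home, "python", "lib", "py4j-0.9-src.zip"))
sys.path.insert(0, os.path.join(spark_home, "python"))
```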


#4

Hello,
I have the same issue.


#5

Hello,
I tried the same thing (I set the environment variable "SPARK_HOME" under Run > Edit Configurations).
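A quick way to check whether the value entered under Run > Edit Configurations > Environment variables actually reached the interpreter (a small diagnostic sketch, not part of the original posts):

```python
import os

# If this prints "not set", the script is running under a different run
# configuration than the one that was edited, or the value never saved.
spark_home = os.environ.get("SPARK_HOME", "not set")
print("SPARK_HOME =", spark_home)
```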


#6

This issue happens because environment variables are not loaded properly. After setting them up, you need to restart PyCharm so that all environment variables get loaded; that solves the problem.