Getting error like "KeyError: 'SPARK_HOME'"



Hi Team,

I am trying to run my first program in PySpark using the PyCharm IDE, and I am getting the error below:

"Traceback (most recent call last):
  File "C:/Users/Admin/PycharmProjects/Pandi_Python/", line 3, in <module>
    sc = SparkContext(master="local", appName="spark_demo")
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\", line 48, in launch_gateway
    SPARK_HOME = os.environ["SPARK_HOME"]
  File "C:\Users\Admin\PycharmProjects\Pandi_Python\venv\lib\", line 425, in __getitem__
KeyError: 'SPARK_HOME'

Process finished with exit code 1"

What does this error mean, and how can I resolve it?

from pyspark import SparkContext

sc = SparkContext(master="local", appName="spark_demo")
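For context on what the traceback is reporting: `os.environ` behaves like a dictionary, so reading an environment variable that was never set raises `KeyError`, which is exactly what pyspark's `launch_gateway()` hits when `SPARK_HOME` is missing. A minimal stdlib illustration (the variable name below is hypothetical and unset for the demo):

```python
import os

# os.environ acts like a dict: reading a variable that is not set raises
# KeyError, which is what pyspark's launch_gateway() hits for SPARK_HOME.
os.environ.pop("SPARK_HOME_DEMO", None)  # make sure the demo variable is unset

try:
    os.environ["SPARK_HOME_DEMO"]
except KeyError as exc:
    print("Lookup failed with KeyError:", exc)  # prints: KeyError: 'SPARK_HOME_DEMO'

# Once the variable is set, the same lookup succeeds:
os.environ["SPARK_HOME_DEMO"] = r"C:\spark-1.6.3-bin-hadoop2.6"
print(os.environ["SPARK_HOME_DEMO"])
```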

Note: I have tried using SparkConf in the import as well.

Even so, I am getting the same error.

Kindly help me resolve this error.

Thanks & Regards
pandia Lakshmanan



Can you check the configuration that you have set up for SPARK_HOME?


I have configured HADOOP_HOME on my system… could you explain how to set SPARK_HOME?



After installing Spark, set the SPARK_HOME path as mentioned in this BLOG.
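For anyone following along, a minimal sketch of setting the variable for the current process only, assuming Spark is installed at the path shown in the traceback (adjust to your own install directory). The permanent fix on Windows is to add SPARK_HOME under System Properties > Environment Variables, or in PyCharm's Run Configuration > Environment variables:

```python
import os

# Set SPARK_HOME for this process before pyspark is imported; the path is
# taken from the traceback above, so adjust it to your actual Spark folder.
os.environ["SPARK_HOME"] = r"C:\spark-1.6.3-bin-hadoop2.6"
print(os.environ["SPARK_HOME"])

# After this, `from pyspark import SparkContext` should no longer raise
# KeyError: 'SPARK_HOME'.
```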


Thanks for your advice, Sunil. I will do that and let you know.