Getting "KeyError: 'SPARK_HOME'" error

localsetup

#1

Hi Team,

I am trying to run my first PySpark program using the PyCharm IDE, and I am getting the error below:

"Traceback (most recent call last):
File "C:/Users/Admin/PycharmProjects/Pandi_Python/spark_demo.py", line 3, in <module>
sc = SparkContext(master="local", appName="spark_demo")
File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 112, in __init__
SparkContext._ensure_initialized(self, gateway=gateway)
File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 245, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway()
File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\java_gateway.py", line 48, in launch_gateway
SPARK_HOME = os.environ["SPARK_HOME"]
File "C:\Users\Admin\PycharmProjects\Pandi_Python\venv\lib\os.py", line 425, in __getitem__
return self.data[key.upper()]
KeyError: 'SPARK_HOME'

Process finished with exit code 1"

What does this error mean, and how do I resolve it?

CODE:
from pyspark import SparkContext

sc = SparkContext(master="local", appName="spark_demo")
print(sc.textFile(r"C:\deckofcords.txt").first())

Note: I have tried importing SparkConf as well…

Even so, I am getting the same error.

Kindly help me resolve this error.

Thanks & Regards
pandia Lakshmanan
09043662036


#2

@Pandia_Lakshmanan

Can you check whether you have set up the SPARK_HOME environment variable?
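As a quick check, a small snippet like the following (a sketch, not part of the original thread — the helper name `check_spark_home` is made up here) can confirm whether SPARK_HOME is visible to the Python process, which is exactly what pyspark's `launch_gateway()` looks for:

```python
import os

def check_spark_home(env=None):
    """Return the SPARK_HOME path, or None if the variable is not set."""
    env = os.environ if env is None else env
    return env.get("SPARK_HOME")

# Report what pyspark's launch_gateway() would see for this process:
print("SPARK_HOME =", check_spark_home())
```

If this prints `SPARK_HOME = None`, the KeyError from the traceback is expected.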


#3

I have configured HADOOP_HOME on my system… could you explain how to set SPARK_HOME?
Thanks
Pandia


#4

@Pandia_Lakshmanan

After installing Spark, set the SPARK_HOME environment variable to the install path, as mentioned in this BLOG.
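If editing the Windows environment variables is inconvenient, SPARK_HOME can also be set from inside the script before pyspark is imported. A minimal sketch, assuming Spark was unpacked to the path shown in the traceback (adjust to your actual install directory):

```python
import os

# launch_gateway() reads os.environ["SPARK_HOME"], so the variable must
# be set before the first pyspark import in this process.
os.environ["SPARK_HOME"] = r"C:\spark-1.6.3-bin-hadoop2.6"

# With SPARK_HOME in place, the original code should then work:
# from pyspark import SparkContext
# sc = SparkContext(master="local", appName="spark_demo")
```

Setting it system-wide (Control Panel → System → Environment Variables) is still the cleaner fix, since it works for every script without code changes.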


#5

Thanks for your advice, Sunil. I will do that and let you know.