Error while setting up Spark


#1

Hi All,
I am getting the below error while setting up Spark.

C:\spark-2.2.1-bin-hadoop2.7\bin>pyspark
Python 2.7.14 (v2.7.14:84471935ed, Sep 16 2017, 20:19:30) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/02/13 00:52:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Traceback (most recent call last):
File "C:\spark-2.2.1-bin-hadoop2.7\bin…\python\pyspark\shell.py", line 45, in
spark = SparkSession.builder
File "C:\spark-2.2.1-bin-hadoop2.7\python\pyspark\sql\session.py", line 183, in getOrCreate
session._jsparkSession.sessionState().conf().setConfString(key, value)
File "C:\spark-2.2.1-bin-hadoop2.7\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 1133, in __call__
File "C:\spark-2.2.1-bin-hadoop2.7\python\pyspark\sql\utils.py", line 79, in deco
raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':"

Thanks.