ValueError: Cannot run multiple SparkContexts at once

hive
pyspark
apache-spark

#1

I am facing the error below. Can someone help?

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=local[*]) created by __init__ at C:/Users/Sreejesh Sreenivasan/PycharmProjects/sparkDemo/src/main/python/sparkdemo.py:4
17/12/18 13:24:49 INFO SparkContext: Invoking stop() from shutdown hook


#2

Try calling sc.stop() on the existing context before you create another SparkContext.
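
A minimal sketch of that pattern (here sc is assumed to be the context that is already running):

from pyspark import SparkContext, SparkConf

sc = SparkContext()   # the context that is already running (e.g. created earlier in the script)
sc.stop()             # stop it before creating a new one

conf = SparkConf().setAppName("testing").setMaster("local")
sc = SparkContext(conf=conf)   # fine now: only one active context at a time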


#3

Below is the code:
from pyspark import SparkContext,SparkConf
conf = SparkConf().setAppName("testing").setMaster("local")
conf =SparkContext()
sc=SparkContext(conf=conf)
orders = sc.textFile("C:\cygwin64\home\Sreejesh Sreenivasan\python\data\retail_db\orders")
print(orders.count())

Where do I need to add sc.stop()?


#4

With this code you shouldn't be getting that error. Can you rerun it and paste a screenshot?


#5

"C:\Users\Sreejesh Sreenivasan\PycharmProjects\sparkDemo\venv\Scripts\python.exe" "C:/Users/Sreejesh Sreenivasan/PycharmProjects/sparkDemo/src/main/python/sparkdemo.py"
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/12/18 19:28:36 INFO SparkContext: Running Spark version 1.6.3
17/12/18 19:28:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/18 19:28:36 INFO SecurityManager: Changing view acls to: Sreejesh Sreenivasan
17/12/18 19:28:36 INFO SecurityManager: Changing modify acls to: Sreejesh Sreenivasan
17/12/18 19:28:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Sreejesh Sreenivasan); users with modify permissions: Set(Sreejesh Sreenivasan)
17/12/18 19:28:37 INFO Utils: Successfully started service 'sparkDriver' on port 49920.
17/12/18 19:28:37 INFO Slf4jLogger: Slf4jLogger started
17/12/18 19:28:37 INFO Remoting: Starting remoting
17/12/18 19:28:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.2.11:49933]
17/12/18 19:28:37 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 49933.
17/12/18 19:28:37 INFO SparkEnv: Registering MapOutputTracker
17/12/18 19:28:37 INFO SparkEnv: Registering BlockManagerMaster
17/12/18 19:28:37 INFO DiskBlockManager: Created local directory at C:\Users\Sreejesh Sreenivasan\AppData\Local\Temp\blockmgr-786b0546-7510-411f-8b4b-2d6276dbbe7f
17/12/18 19:28:37 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
17/12/18 19:28:37 INFO SparkEnv: Registering OutputCommitCoordinator
17/12/18 19:28:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/12/18 19:28:37 INFO SparkUI: Started SparkUI at http://192.168.2.11:4040
17/12/18 19:28:37 INFO Executor: Starting executor ID driver on host localhost
17/12/18 19:28:37 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 49952.
17/12/18 19:28:37 INFO NettyBlockTransferService: Server created on 49952
17/12/18 19:28:37 INFO BlockManagerMaster: Trying to register BlockManager
17/12/18 19:28:37 INFO BlockManagerMasterEndpoint: Registering block manager localhost:49952 with 511.1 MB RAM, BlockManagerId(driver, localhost, 49952)
17/12/18 19:28:37 INFO BlockManagerMaster: Registered BlockManager
Traceback (most recent call last):
  File "C:/Users/Sreejesh Sreenivasan/PycharmProjects/sparkDemo/src/main/python/sparkdemo.py", line 4, in <module>
    sc=SparkContext(conf=conf)
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 261, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=local[*]) created by __init__ at C:/Users/Sreejesh Sreenivasan/PycharmProjects/sparkDemo/src/main/python/sparkdemo.py:3

Process finished with exit code 1


#6

The issue was resolved after removing the line:

conf =SparkContext()
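
For reference, a working version of the script from post #3 would look like the sketch below. The stray conf =SparkContext() line both started a first context and rebound conf to that context, so the following SparkContext(conf=conf) call found one already running. (Using a raw string for the Windows path is an extra, optional fix so the backslashes are not read as escape sequences.)

from pyspark import SparkContext, SparkConf

conf = SparkConf().setAppName("testing").setMaster("local")
sc = SparkContext(conf=conf)   # the only SparkContext created in this script

# raw string keeps the backslashes in the Windows path literal
orders = sc.textFile(r"C:\cygwin64\home\Sreejesh Sreenivasan\python\data\retail_db\orders")
print(orders.count())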

