Error while working on pyspark

apache-spark

#1

Hi,

whenever I execute any Spark command, the INFO log lines shown below appear on screen automatically, which makes it hard to practice. Please help me fix it.

orders.map(lambda o: o.split(",")[3]).t
18/07/21 13:59:27 INFO YarnClientSchedulerBackend: Requesting to kill executor(s) 7
18/07/21 13:59:27 INFO ExecutorAllocationManager: Removing executor 7 because it has been idle for 10 seconds (new desired total will be 0)
18/07/21 13:59:27 INFO YarnClientSchedulerBackend: Disabling executor 7.
18/07/21 13:59:27 INFO DAGScheduler: Executor lost: 7 (epoch 0)
18/07/21 13:59:27 INFO BlockManagerMasterEndpoint: Trying to remove executor 7 from BlockManagerMaster.
18/07/21 13:59:27 INFO BlockManagerMasterEndpoint: Removing block manager BlockManagerId(7, wn01.itversity.com, 41162)
18/07/21 13:59:27 INFO BlockManagerMaster: Removed 7 successfully in removeExecutor
18/07/21 13:59:27 INFO YarnScheduler: Executor 7 on wn01.itversity.com killed by driver.
18/07/21 13:59:27 INFO ExecutorAllocationManager: Existing executor 7 has been removed (new total is 0)


#2

Hi @Akashivu,
Use the command below to suppress these INFO messages:
sc.setLogLevel("ERROR")
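Note that sc.setLogLevel only applies to the current shell session. A more persistent alternative (assuming a standard Spark installation where you can edit the conf directory) is to lower the root log level in conf/log4j.properties, so every new shell starts quietly:

```properties
# conf/log4j.properties
# If this file does not exist, copy it from conf/log4j.properties.template first.
# Changing INFO to ERROR here means only ERROR messages reach the console.
log4j.rootCategory=ERROR, console
```

You can also set it to WARN instead of ERROR if you still want to see warnings.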


#3

It works for me… thanks @Sravan_Kumar

