Strange INFO messages displayed after executing Spark commands in PySpark


Hi Team,

I am using Big Data Labs for hands-on practice with Spark. Unfortunately, after executing every Spark command I see the INFO messages below:

18/07/04 01:14:16 INFO YarnClientSchedulerBackend: Requesting to kill executor(s) 7
18/07/04 01:14:16 INFO ExecutorAllocationManager: Removing executor 7 because it has been idle for 10 seconds (new desired total will be 0)
18/07/04 01:14:17 INFO YarnClientSchedulerBackend: Disabling executor 7.
18/07/04 01:14:17 INFO DAGScheduler: Executor lost: 7 (epoch 0)
18/07/04 01:14:17 INFO BlockManagerMasterEndpoint: Trying to remove executor 7 from BlockManagerMaster.
18/07/04 01:14:17 INFO BlockManagerMasterEndpoint: Removing block manager BlockManagerId(7,, 33774)
18/07/04 01:14:17 INFO BlockManagerMaster: Removed 7 successfully in removeExecutor
18/07/04 01:14:17 INFO YarnScheduler: Executor 7 on killed by driver.
18/07/04 01:14:17 INFO ExecutorAllocationManager: Existing executor 7 has been removed (new total is 0)

This is not what is shown in the videos: when the instructor on Udemy runs the same commands in the Spark console, no such messages appear. I am following dgadiraju's CCA-175 course for hands-on practice in this lab.

I initialized PySpark using the command below:
pyspark --master yarn --conf spark.ui.port=12888

Please find below a screenshot of the same.



Hi Aditya,

Seeing this information in the Spark shell is normal.

You may look up how to suppress Spark INFO messages if you would like.

Hope this helps.
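If you want the suppression to be permanent rather than per-session, one common approach is to raise the root log level in Spark's log4j configuration. A minimal sketch, assuming a Spark 1.x/2.x install with log4j 1.x and a conf directory at $SPARK_HOME/conf (paths and file names may differ in your lab environment; check with your lab admins before changing shared config):

```properties
# $SPARK_HOME/conf/log4j.properties
# (copy from log4j.properties.template if this file does not exist)

# Raise the root logger threshold from INFO to ERROR so INFO messages are hidden
log4j.rootCategory=ERROR, console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

This affects every shell launched from that Spark install, whereas sc.setLogLevel(...) only affects the current session.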


Hi @aditya.chatterjee

Use sc.setLogLevel("ERROR") to stop Spark from displaying INFO logs.


Hi @Sunil_Abhishek

Is it possible to stop or suppress the INFO messages that appear after a Spark command finishes executing? When the instructor demonstrates the commands, no INFO messages appear once the command executes and the output is displayed, but in my case INFO messages keep appearing after my output is displayed.