I'm facing some issues with PySpark after running a script.



18/10/09 23:11:14 INFO YarnClientSchedulerBackend: Requesting to kill executor(s) 2
18/10/09 23:11:14 INFO ExecutorAllocationManager: Removing executor 2 because it has been idle for 10 seconds (new desired total will be 1)
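For context, these INFO lines come from Spark's dynamic allocation feature releasing executors that have sat idle; they indicate normal behavior rather than a failed job. If the executor churn itself is a concern, a minimal sketch of launching the shell with dynamic allocation disabled (assuming you control the pyspark launch command, the cluster runs on YARN, and a fixed two executors are enough for this workload):

pyspark --master yarn --num-executors 2 --conf spark.dynamicAllocation.enabled=false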


Hi @rayudu_darapaneni, can you paste the command that you are using?


In general, whatever command I run, it always runs into some issue. For example:

for i in products.take(10): print(i)

productsMapSort = (products
    .filter(lambda x: x.split(',')[4] != '')          # drop records whose 5th comma-separated field is empty
    .map(lambda x: (float(x.split(',')[4]), x)))      # key each record by that field as a float

for i in productsMapSort.take(10): print(i)


The command is correct and it is working fine.


Yes, that's true, but a lot of unnecessary things happen after I run any command, like the "kill executor" messages shown above and so on…


Use sc.setLogLevel("ERROR") to turn off the INFO logs after launching pyspark.
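A minimal example, assuming the default pyspark shell where sc is the SparkContext the shell creates for you:

sc.setLogLevel("ERROR")                  # from this point on, only ERROR (and above) messages are printed
for i in products.take(10): print(i)     # the executor INFO chatter no longer appears around the output

This only changes log verbosity for the current session; it does not change how executors are allocated or removed.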