Unable to type a single command fully - PySpark


#1

Hi Team,

I started practicing PySpark using the labs.

I am continuously running into an issue where the shell does not let me type a statement all the way to the end. Before I can even finish, I get an error message as shown below.
Could you please look into this and fix it as soon as possible?

>>> orders.map(lambda a: a.split(",")[1].replace("-","")).f18/07/05 20:05:59 WARN SparkContext: Killing executors is only supported in coarse-grained mode
18/07/05 20:05:59 WARN ExecutorAllocationManager: Unable to reach the cluster manager to kill executor driver!

This is really annoying and breaks my concentration.
Thanks in advance.
Regards,
Subbu


#2

Hi @My_Learning

After launching Spark, run sc.setLogLevel("ERROR") to stop the INFO and WARN messages from being printed to the console.
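In the pyspark shell the SparkContext is already available as sc, so the change takes effect immediately. A minimal sketch is below; the orders RDD and the /data/retail_db/orders path are assumptions for illustration, not taken from your setup:

>>> sc.setLogLevel("ERROR")  # raise the log threshold so INFO/WARN lines stop interrupting input
>>> orders = sc.textFile("/data/retail_db/orders")  # hypothetical path, for illustration only
>>> orders.map(lambda a: a.split(",")[1].replace("-", "")).first()  # e.g. take the second CSV field and strip the dashes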


#3

How are you launching pyspark? Please paste your command.


#4

Thanks. After turning off the logs, I am able to type the statements… :slight_smile:


#5