Launching Spark: getting error "Failed to send RPC to wn04.itversity.com"

I tried to launch Spark using the command below:
pyspark --master yarn --conf spark.ui.port=12577 --num-executors 2 --executor-memory 512

The SparkContext got created, but after some time I am getting the error below on the console:

SparkContext available as sc, HiveContext available as sqlContext.

20/08/26 05:46:55 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
20/08/26 05:46:55 ERROR TransportClient: Failed to send RPC 4750050639454937952 to wn04.itversity.com/172.16.1.107:55403: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException

Hi @Amit_Jain,

Try giving a unit when passing the value to the --executor-memory parameter.

Use the command below to launch your pyspark shell:

pyspark --master yarn --conf spark.ui.port=12577 --num-executors 2 --executor-memory 512mb
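
For reference, the same flags map onto Spark configuration properties, so they can also be set programmatically when submitting a script instead of using the shell. Below is a minimal sketch, assuming a Spark 1.x setup like the one in the banner above (the app name and exact values are illustrative, not part of the original thread):

# Minimal sketch (assumes Spark 1.x with YARN configs already on the classpath,
# matching the SparkContext/HiveContext banner above); it only shows the
# configuration keys that correspond to the command-line flags.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("memory-unit-example")       # illustrative app name
        .setMaster("yarn-client")
        .set("spark.ui.port", "12577")
        .set("spark.executor.instances", "2")    # same as --num-executors 2
        .set("spark.executor.memory", "512m"))   # note the unit suffix
sc = SparkContext(conf=conf)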