I want to learn how Spark jobs are executed on a cluster, but the jobs I run in spark-shell execute on a single driver, since the master is `local` and the deploy mode defaults to `client`. Can anyone help me launch my spark-shell in cluster mode? I am using Spark with Scala.
@Kedarkumar_Golla Use the command below to launch spark-shell with YARN as the cluster manager:
spark-shell --master yarn --conf spark.ui.port=12335
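One caveat: interactive shells such as spark-shell always run the driver locally, so they only support `client` deploy mode; with `--master yarn` the executors run on the cluster while the driver stays on your machine. If you want the driver itself to run on the cluster (`--deploy-mode cluster`), you need to package your application as a jar and submit it with spark-submit. A minimal sketch, where the class name and jar path are placeholders for your own application:

```shell
# spark-shell cannot use cluster deploy mode (the driver must stay local
# for the interactive REPL), so a driver-on-cluster run goes through
# spark-submit with a packaged jar.
# com.example.MyApp and the jar path are hypothetical placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  target/my-app_2.12-1.0.jar
```

Inside a shell launched with `--master yarn`, you can confirm the cluster manager by checking `sc.master`, which should report `yarn`.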