Unable to restart SparkContext after sc.stop

apache-spark

#1

Hi,

I wanted to set my own properties for sc, so I did sc.stop() after logging into spark-shell. However, it throws an error when I set the new properties.
This is how I proceeded:

spark-shell --master yarn --deploy-mode client --conf spark.ui.port=12355 --num-executors 1 --executor-memory 2048M
sc.stop
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
val conf = new SparkConf().setMaster("yarn-new").setAppName("Practice")
val sc = new SparkContext(conf)

When I try to read a file, it says there is no active SparkContext.

Kindly tell me where I am going wrong; also, how do I restart the SparkContext?

Please find the screenshots attached.

Kindly help me resolve this.
Thanks,
Ashma.


#2

@ashmapoonacha In the log, I can see an error with yarn-new. Try giving yarn-client instead:


val conf = new SparkConf().setMaster("yarn-client").setAppName("test")

And the data should be read from HDFS:

val orders1 = sc.textFile("HDFS path")
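
Putting those two changes together, one possible end-to-end sequence inside spark-shell (Spark 1.x style master strings; the HDFS path below is only an example) would be:

import org.apache.spark.{SparkConf, SparkContext}

// stop the context that spark-shell created on startup
sc.stop()

// yarn-client is a valid master string on the 1.x shell used here
val conf = new SparkConf().setMaster("yarn-client").setAppName("Practice")

// create the replacement context
val sc = new SparkContext(conf)

// read input from HDFS (example path only)
val orders1 = sc.textFile("/user/ashma/orders")
orders1.take(5).foreach(println)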


#3

Thank you, Balu.

  1. Yes, it has to be either yarn-client or yarn-cluster.
  2. It worked fine when I read the data from the HDFS location.
    Alternatively, I had to use setMaster("local") to read data from the local file system using
    sc.textFile("file:///path to the file/"); a minimal sketch is below.
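
For the local-mode case in point 2, a minimal sketch of the sequence in spark-shell would be (the file path is only an example):

import org.apache.spark.{SparkConf, SparkContext}

// stop the shell's existing context first
sc.stop()

// "local" runs Spark on this machine (local[*] would use all available cores)
val conf = new SparkConf().setMaster("local").setAppName("Practice")
val sc = new SparkContext(conf)

// a file:/// URI reads from the local file system (example path only)
val orders = sc.textFile("file:///home/ashma/data/orders.txt")
println(orders.count())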

Thanks again!


#4