Cannot launch spark-context with new SparkContext(conf) in the cluster


Hi @itversity,

Can you please tell me why my SparkContext (sc) cannot be reconfigured using the settings below:
--master yarn
--conf spark.ui.port=12345

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

val conf = new SparkConf().
  setAppName("crime types in residence")

val sc1 = new SparkContext(conf)
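For context, the usual way to apply these settings is on the spark-shell command line at launch, so the shell's own context picks them up. A minimal sketch (assuming a YARN-enabled cluster gateway):

```shell
# Launch spark-shell against YARN with a custom Spark UI port;
# the sc created by the shell will already have these settings.
spark-shell --master yarn --conf spark.ui.port=12345
```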




When you launch spark-shell, it automatically creates a SparkContext (sc) for you. You don't need to create a SparkContext the way you are doing; only one SparkContext can be active per JVM, so creating a second one fails while the default sc is still running.

If you are developing in an IDE, then you have to create a SparkContext yourself to use the APIs. Not sure what made you develop directly in spark-shell; it is not the right approach.
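That said, if you really want to replace the context from inside spark-shell, a minimal sketch (assuming Spark 1.6.x in a shell session, where sc already exists; the yarn-client master is an assumption for this cluster) is:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Only one SparkContext may be running per JVM, so stop the
// context spark-shell created before building a new one.
sc.stop()

val conf = new SparkConf().
  setAppName("crime types in residence").
  setMaster("yarn-client").          // assumption: YARN client mode
  set("spark.ui.port", "12345")

// This succeeds only because the default sc was stopped above.
val sc1 = new SparkContext(conf)
```

This is a session sketch, not something to put in application code; in an IDE project you would build the SparkConf and SparkContext once, with no pre-existing sc to stop.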

I would highly recommend signing up for a course or following the YouTube playlist. You need to start using an IDE on your local system for development.

@annapurna @BaLu_SaI @hemanthvarma @Ramesh1



Hi @itversity,

Yes, I already know that, and I have already completed the training, but I was trying to manually reconfigure the SparkContext because I am exploring all scenarios for my certification exam next week.
This approach is shown in your own course "HDPCD: Spark using Scala":

HDPCD: Spark using Scala -> Section 4: Core Spark transformations and actions with adva… -> videos 35, 36.

Thanks and Regards,