Cannot launch SparkContext with new SparkContext(conf) on the cluster

#1

Hi @itversity,

Can you please tell me why my SparkContext (sc) cannot be reconfigured using the commands below?

1. Launch spark-shell:
spark-shell --master yarn --conf spark.ui.port=12345

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

val conf = new SparkConf().
  setMaster("yarn-client").
  setAppName("crime types in residence")

val sc1 = new SparkContext(conf)



#2

When you launch spark-shell, it automatically creates a SparkContext for you and exposes it as sc; your new SparkContext(conf) fails because that context is already running in the shell's JVM.
You don't need to create another one the way you are doing.
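
If you really do want to reconfigure the context from inside spark-shell, the usual workaround is to stop the existing one first and only then build your own. A rough sketch, assuming Spark 1.6.x with YARN configured:

// stop the context that spark-shell created for you
sc.stop()

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

val conf = new SparkConf().
  setMaster("yarn-client"). // valid in Spark 1.6.x; Spark 2.x uses --master yarn plus a deploy mode instead
  setAppName("crime types in residence")

// this succeeds now because no other SparkContext is running in the shell's JVM
val sc1 = new SparkContext(conf)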

If you are developing in an IDE, then you have to create a SparkContext yourself to use the APIs. I am not sure what made you develop directly in spark-shell; it is not the right approach.
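
For reference, a bare-bones standalone program would look roughly like the sketch below. The object name CrimeTypesInResidence and the local[*] master are placeholders, and it assumes spark-core 1.6.x is on the build path:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

// hypothetical driver object; the name is only a placeholder
object CrimeTypesInResidence {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().
      setAppName("crime types in residence").
      setMaster("local[*]") // run locally from the IDE; on the cluster, set the master via spark-submit instead

    val sc = new SparkContext(conf)

    // transformations and actions go here

    sc.stop()
  }
}

You would then package this as a jar and run it on the cluster with spark-submit, rather than pasting it into spark-shell.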

I would highly recommend signing up for a course or following the YouTube playlist. You need to start using an IDE on your local system for development.

@annapurna @BaLu_SaI @hemanthvarma @Ramesh1


#3

Hi @itversity,

Yes, I already know that, and I have already completed the training, but I was trying to manually reconfigure the SparkContext because I am exploring all scenarios for my certification exam next week.
Also, this very approach is shown in your course "HDPCD: Spark using Scala":

HDPCD: Spark using Scala -> Section 4: Core Spark transformations and actions with adva… -> videos 35 and 36.

Thanks and Regards,
Sabby
