Please share environment details for the HDPCD Spark certification


I want to appear for the HDPCD Spark certification this month. I have gone through the Udemy courses prepared by Durga sir.

Here are my questions related to the certification environment:

  1. In the certification environment, will sqlContext already be available, or do we need to create it ourselves?
  2. Should sqlContext be created from SQLContext or from HiveContext (i.e. org.apache.spark.sql.SQLContext or org.apache.spark.sql.hive.HiveContext)?
  3. While launching spark-shell in the certification, do we have the option to select the number of executors and cores?
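For what it's worth, here is the usual Spark 1.x behavior outside the exam (an assumption on my part; the certification environment may differ): `spark-shell` normally pre-creates `sc` and a `sqlContext` (a `HiveContext` when Hive is on the classpath), and executor resources can be requested with launch flags. A sketch, assuming an HDP-style Spark 1.6 setup:

```scala
// Requesting executors and cores at launch (standard spark-shell flags):
//   spark-shell --master yarn --num-executors 4 --executor-cores 2 --executor-memory 2g

// If sqlContext is NOT pre-created, it can be built from the SparkContext `sc`
// that spark-shell always provides. HiveContext is a superset of SQLContext
// (adds Hive metastore tables, Hive UDFs, window functions), so it is the
// safer choice when Hive is available:
import org.apache.spark.sql.hive.HiveContext
val sqlContext = new HiveContext(sc)

// Plain SQLContext also works if Hive classes are not on the classpath:
// import org.apache.spark.sql.SQLContext
// val sqlContext = new SQLContext(sc)
```

Whether the exam cluster pre-creates the context and allows custom launch flags is exactly what the questions above ask, so please confirm against the official exam environment.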
