I want to appear for the HDPCD Spark certification this month. I have gone through the Udemy courses prepared by Durga sir.
Here are my questions related to the certification environment:
- In the certification environment, will sqlContext already be available, or do we need to create it ourselves?
- Should sqlContext be created from SQLContext or HiveContext (i.e. org.apache.spark.sql.SQLContext or org.apache.spark.sql.hive.HiveContext)?
- In the certification, do we have the option to choose the number of executors and cores when launching spark-shell?
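For reference while practicing, here is a sketch of how these pieces fit together in a Spark 1.x environment (the version HDPCD was based on). The executor/core values below are illustrative, not what the exam mandates, and whether the exam cluster allows overriding them is exactly what the question above asks:

```shell
# Launch spark-shell with explicit executor settings (values are examples only)
spark-shell --master yarn \
  --num-executors 2 \
  --executor-cores 2 \
  --executor-memory 2g

# Inside a Spark 1.x spark-shell, a pre-built sqlContext is normally available,
# and on HDP (Hive on the classpath) it is typically a HiveContext instance.
# If you ever need to create one manually from the existing SparkContext `sc`:
#
#   import org.apache.spark.sql.hive.HiveContext
#   val sqlContext = new HiveContext(sc)
#
# HiveContext extends SQLContext, so it supports everything SQLContext does
# plus Hive metastore access and HiveQL.
```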