Hi Durga Sir,
In session 71 of HDPCD:Spark using Scala, you showed lots of Hive functions, but I have some questions about the HDPCD certification environment:
1. Will Hive be installed in the given environment?
2. Will a SQLContext with Hive support (HiveContext) be available, or do we need to create it ourselves to use Hive tables and other functions?
3. I was under the impression that they would expect more Spark and Scala usage, but we mostly used Hive APIs.
4. Which one is better to use in real-time projects: the DataFrame API or Hive?
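Regarding question 2, here is a minimal sketch of what creating a HiveContext yourself would look like, assuming Spark 1.6.x built with Hive support (the version commonly used in HDPCD:Spark preparation); the database and table names below are hypothetical:

```scala
import org.apache.spark.sql.hive.HiveContext

// In spark-shell, `sc` (the SparkContext) is already available.
// A HiveContext can be created from it even if the shell only
// exposes a plain SQLContext by default.
val hiveContext = new HiveContext(sc)

// Query an existing Hive table (database/table names are hypothetical)
val orders = hiveContext.sql(
  "SELECT order_id, order_status FROM retail_db.orders LIMIT 10")
orders.show()
```

In Spark 1.6, HiveContext extends SQLContext, so everything available on a SQLContext also works on a HiveContext.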