HDPCD:Spark - Scala


Originally published at: http://www.itversity.com/courses/hdpcd-spark-scala/

THE HDPCD:SPARK EXAM Hortonworks University is excited to announce a new hands-on, performance-based certification for Spark on the Hortonworks Data Platform (HDP). Our industry-recognized certifications are unique because candidates perform actual tasks on a live installation of our products, instead of simply guessing at multiple-choice questions. Being able to prove your skills allows you to…


Issues related to HDPCD:Spark with Scala can be discussed here.


I have a doubt related to the HDPCD:Spark Developer exam.
We need to do live coding on the cluster, so we need to write programs in some editor. Your lessons (http://www.itversity.com/lessons/getting-started/) show a step-by-step guide, which is very useful; they use Eclipse for writing programs and Maven for compiling. My question is: in the exam, are we allowed to use an editor such as Eclipse, or do we have to use the vi editor only, compile the Scala program manually, create a jar, and deploy it? Or can we use Zeppelin?

Thanks in advance.


Hortonworks has not shared many details about the exam yet.
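Since the exam format is unclear, it is safest to practice writing small programs without relying on an IDE. As a hypothetical drill (the object name, method, and sample input below are illustrative, not from the exam), here is a word count written with plain Scala collections; the same flatMap/map/reduce shape carries over to Spark's RDD API (`flatMap`, `map`, `reduceByKey`).

```scala
// Hypothetical practice drill: word count using only plain Scala collections.
// The same pipeline shape maps onto Spark RDDs: flatMap -> map -> reduceByKey.
object WordCount {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))      // split each line into words
      .filter(_.nonEmpty)            // drop empty tokens
      .map(w => (w.toLowerCase, 1))  // pair each word with a count of 1
      .groupBy(_._1)                 // group the pairs by word
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) } // sum the counts

  def main(args: Array[String]): Unit = {
    val sample = Seq("spark scala spark", "scala exam")
    println(wordCount(sample))
  }
}
```

Writing something like this in vi, then compiling and packaging it yourself, is good preparation in case the exam environment does not provide an IDE.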


The instructions for setting up Scala, sbt, and Eclipse, and for integrating sbt with Eclipse, are specific to Mac.

Can you provide instructions on how to set these up on Windows as well?



It is not too complex to do on Windows. You can start setting it up on Windows and ask questions if you get stuck somewhere.

You can also give IntelliJ IDEA a try - http://www.itversity.com/topic/scala-setup-development-environment/
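Whichever OS and IDE you use, the sbt project definition itself is the same. A minimal `build.sbt` sketch is below; the version numbers are illustrative, so match `scalaVersion` and the Spark version to the HDP release you practice against.

```scala
// Minimal build.sbt sketch for a Spark practice project.
// Version numbers are illustrative — align them with your HDP cluster.
name := "spark-practice"
version := "1.0"
scalaVersion := "2.10.6"

// "provided" scope: the cluster supplies Spark at runtime via spark-submit,
// so the jar you build should not bundle spark-core itself.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
```

With this in place, `sbt package` produces a jar you can pass to `spark-submit`, which works the same way on Windows, Mac, or the exam cluster.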