Sent you a message with my email contact.
Hello there, congrats on clearing the exam. Did you take the exam in Scala or Spark? Also, which video series did you follow?
Which delimiters did you use in Sqoop and spark-shell, apart from the ones below? Please add anything I have missed.
\r (carriage return)
Also, any idea what the escape for a space delimiter is?
I used Scala (you mean Scala or Python, right?). Use the playlist below.
Among the above, I have used tab, double quotes, single quotes, and backslash; I haven’t tried other delimiters yet. Space might not need an escape character, I guess, but I haven’t tried it out. You can try it and let others know how it works.
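For what it’s worth, here is a small sketch (plain Scala, so it runs outside spark-shell too) of how these delimiters behave with `String.split`, which takes a regex: escape sequences like `\t` work directly, a plain space needs no escaping, but regex metacharacters such as `|` do. The sample values are made up.

```scala
// Tab works as a normal escape sequence.
val tabSep = "a\tb\tc".split("\t")
// A plain space needs no escape character at all.
val spaceSep = "a b c".split(" ")
// Pipe is a regex metacharacter, so it must be escaped for split.
val pipeSep = "a|b|c".split("\\|")
// Stripping double quotes from quoted fields.
val unquoted = "\"a\",\"b\"".split(",").map(_.replaceAll("\"", ""))

println(spaceSep.mkString(","))   // a,b,c
println(unquoted.mkString(","))   // a,b
```

Note this is only about spark-shell parsing; Sqoop’s `--fields-terminated-by` takes its own escape syntax, which is worth checking in the Sqoop docs separately.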
Thank you so much,
Yes, I meant Scala or Python.
One more question: did you have to start the spark-shell in local mode, like you would in the VM installed on your PC/Mac, or on a cluster using yarn as master?
Do you have to build a JAR or something, or can you do everything in spark-shell?
I am working on the problems from Arun’s blog, and he does everything in the spark-shell. Does that environment work in the exam too?
Thanks in advance.
Hi Bala, congratulations!
Could you please share your contact details? I have some questions about the exam. Appreciate your help.
I started the spark-shell in yarn mode, but I don’t think it’s required. You can just use spark-shell in the exam.
You need not build any JAR in the exam. All problems can be solved in the REPL in one or two lines of code.
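To illustrate the “one or two lines” point, here is a minimal sketch using plain Scala collections in place of an RDD (the records and field layout are made up; in spark-shell you would start from `sc.textFile(...)` instead of a local `List`):

```scala
// Made-up order records standing in for sc.textFile("orders"):
// order_id, order_date, order_status
val orders = List("1,2013-07-25,CLOSED", "2,2013-07-25,PENDING", "3,2013-07-26,CLOSED")

// Typical exam-style short transformation: parse each record and count by status.
val byStatus = orders.map(_.split(",")(2)).groupBy(identity).map { case (k, v) => (k, v.size) }

println(byStatus("CLOSED"))  // 2
```

The same shape (`map`, then a grouped aggregation) carries over to RDDs almost verbatim, which is why the REPL is enough.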
Yes, it should work.
Thank you, Bala, for the clarification.
Congrats! And thanks for clarifying all the doubts, it really helped. I am taking it next week. I would like to have a quick chat with you, so kindly share your email id. Mine is email@example.com.
Thanks - Sakthi.
Did you get questions involving Hive, either with Sqoop or in Spark via HiveContext? I mean, did you have to use or refer to tables in Hive?
Yes, you might get a question that uses the Hive metastore as the source for your problem.
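For reference, a sketch of what that looks like in spark-shell (Spark 1.x style, where `sqlContext` is a `HiveContext` when Spark is built with Hive support). The table and column names here are made up, and this obviously needs a configured metastore to run:

```scala
// Illustrative only: query a table registered in the Hive metastore.
sqlContext.sql("select order_status, count(*) from orders group by order_status").show()
```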
My question is regarding fill-in-the-blank Spark questions and the exam format:
Are all Spark-related questions fill in the blanks?
People are saying they provide you a Spark template in an .sh file. Do we need to fill in the blanks in that file and then copy and paste it into the Scala or pyspark environment to run it?
Also, could you share as many questions as you remember? Are they similar to Arun’s problems, or harder?
@BalaGanesh: Hi, could you send Arun’s blog link, please? Also, could you please send me your email contact. Thanks.
There are no fill-in-the-blank questions. Kindly go through Durga’s intro video on the new exam format. No code snippets will be provided either; you have to type all the code yourself. The questions are similar to what Arun has specified in his blog, but relatively easier. Sorry, I cannot disclose any questions from the exam.
Hi Bala, could you also please share the playlist URL you used to clear the exam?
What was the source data? Are they using the same Order and Order Items tables, or different ones? Could you send me the gist of the questions you remember? firstname.lastname@example.org