Exam tips for CCA 175 takers


#84

@dksrinivasa I have one doubt: even if they ask us to run the script, can we execute it in the CLI line by line?
I mean, I can run the script, but I am wondering what happens if something goes wrong. Can I execute it line by line in the pyspark/scala shell, since the end result (storing, aggregating, etc.) would be the same?

I think there would be two scripts: a main .sh script and a child .py/.scala script. We need to fill in the .py script and launch the .sh script (./script.sh), which will call the .py script and execute it.
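As a rough sketch (the file names, paths, and options below are invented for illustration, not taken from the actual exam), such a wrapper might look like this:

```shell
#!/bin/sh
# Hypothetical run.sh: submits the skeleton file after the candidate
# has filled in the missing logic. All paths here are made up.
spark-submit \
  --master yarn \
  /home/cert/problem1/skeleton.py

# For a Scala skeleton the line would instead reference a compiled jar:
#   spark-submit --class Solution --master yarn solution.jar
```

Either way, you only fill in the skeleton; the submission command itself is already provided.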


#91

Yes you are absolutely right!


#95

Please share a YouTube link for preparing for the Scala/Spark CCA 175 exam.


#96

http://www.itversity.com/lessons/transform-stage-store-using-spark-with-scala/


#108

@email2dgk Great, Thanks!


#117

Yes, that is true, but be aware that this particular case is changing: recently people have been getting Spark questions without the skeleton, in which case you have to write the code from scratch.


#124

@Avinash_Parida

OK, so no need to create a JAR file for Spark!! Instead we use the scala/python shell…

Thanks for the clarification!!


#125

It’s up to you whether to validate your results using the REPL (spark-shell/pyspark), but the goal is to use the given .sh script to do spark-submit.


#126

There are two scenarios for your question:

1. If they provide a skeleton: you have to fill in the skeleton, and there will be another .sh file that invokes it. After filling in the skeleton, you just need to run the .sh file from your terminal. For validating, you can use the scala/python CLI, but you execute the .sh file itself from the terminal.

2. If they don’t provide a skeleton: you have to write the code from scratch, and here you will execute it from your scala/python shell.
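The parent/child mechanics in scenario 1 can be simulated locally without a cluster. Everything below is illustrative: on the real exam the wrapper would call spark-submit rather than python3, and the file names are made up.

```shell
# Stand-in for the skeleton the candidate has filled in.
cat > /tmp/skeleton.py <<'EOF'
print("aggregation done, output saved")
EOF

# Stand-in for the provided wrapper; on the exam this line would be
# something like: spark-submit --master yarn /path/to/skeleton.py
cat > /tmp/run.sh <<'EOF'
#!/bin/sh
python3 /tmp/skeleton.py
EOF

chmod +x /tmp/run.sh
/tmp/run.sh
```

The point is simply that the wrapper is what you launch; the child script is where your answer lives.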


#132

Does a reattempt cost the same? Or do they give extra chances?


#134

Hi Promod,

Thanks for your preparation brief; it is really helpful.
Do we need to be proficient in both Scala and Python to cover the code-snippet style questions?

It would also be great to know whether we would be writing any Spark applications instead of using spark-shell, and how to use “run.sh” in such cases.

Appreciate your response.