Running Spark scripts

Hello friends,
I am still confused about the Python/Scala scripts that will be given.

  1. Are we going to execute a .py/.scala script, or run the commands in the CLI one by one?
  2. Even if they ask us to run a script, can we execute it in the CLI one by one? I can run the script, but in case something goes wrong I would rather execute it line by line in the pyspark/Scala shell, since the end result (storing, aggregating, etc.) should be the same as running the whole script.
  3. Do we need to create the SparkContext (sc) in the script ourselves, or will they provide all that configuration? I have heard the exam gives skeleton code with sections to fill in.
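On question 3, if the skeleton does not create a context for you, the script has to build its own to run under spark-submit. Here is a minimal sketch, assuming the skeleton does NOT provide sc (the file name and app name are placeholders I made up; if a skeleton already defines sc, skip those lines):

```shell
# Write a minimal self-contained PySpark script. This assumes the exam
# skeleton does NOT create the context for you; "solution.py" and the
# app name are placeholder names, not anything from the exam.
cat > solution.py <<'EOF'
from pyspark import SparkConf, SparkContext

# Create the context ourselves so the file runs under spark-submit as-is
conf = SparkConf().setAppName("solution")
sc = SparkContext(conf=conf)

# ... fill in the transformations the question asks for, e.g.
# sc.textFile("input").map(...).saveAsTextFile("output")

sc.stop()
EOF
```

If the skeleton already sets up sc for you, creating a second SparkContext would actually fail, so check the given code first.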

4) If it is a script, then `spark-submit --master local` will execute the .py script; but should we run it in local mode or YARN mode?
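On question 4, both invocations below are standard spark-submit usage, but I can't confirm which master the exam expects, so treat this as a sketch (the script name is a placeholder, and spark-submit itself is not run here, only the command strings are shown):

```shell
SCRIPT=solution.py   # placeholder for the script you filled in

# Local mode: driver and executors run in one process on this machine;
# handy for a quick sanity check of your logic.
LOCAL_CMD="spark-submit --master local[2] $SCRIPT"

# YARN client mode: the job is distributed across the cluster; on a
# cluster-based exam this is the usual choice (an assumption on my part).
YARN_CMD="spark-submit --master yarn --deploy-mode client $SCRIPT"

echo "$LOCAL_CMD"
echo "$YARN_CMD"
```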

5) Or will there be two scripts: a main .sh script and a child .py/.scala script? That is, we fill in the .py script and launch the .sh script (./script.sh), which calls the .py script and executes it without us doing all the configuration and setting up sc, etc.
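If it really is the .sh-wrapping-a-.py layout from point 5, the wrapper would typically generate the Python file with a heredoc and then submit it. A sketch under that assumption (all file names are hypothetical, and the spark-submit line is commented out since I'm not assuming a Spark installation here):

```shell
#!/bin/sh
# Hypothetical wrapper: generate job.py via a heredoc (the <<EOF pattern
# mentioned in this thread), then hand it to spark-submit.
cat > job.py <<'EOF'
from pyspark import SparkConf, SparkContext
sc = SparkContext(conf=SparkConf().setAppName("job"))
# ... the solution code you fill in goes here ...
sc.stop()
EOF

# On the exam machine you would then run something like:
# spark-submit --master yarn job.py
echo "wrote job.py"
```

In that layout the boilerplate lives inside the heredoc, so "filling in the script" just means editing the lines between cat and EOF before running ./script.sh.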

Please, I need help understanding what the script will look like and how we will be executing it.

@pramodvspk @hemprasadk: hello guys, since you have cleared the certification, can you please help me with this?
Thanks in advance.

@Avinash_Parida I have the same confusion. Please, everyone who cleared the certification, elaborate on this. A few people mentioned they could not clear the certification because of this .sh problem. I know how to execute a .sh file, but as Avinash mentioned, we may need to call a .py file from it; or, as in Durga sir's video, there may be a .sh file containing Python/Scala code to fill in via a heredoc (the <<EOF pattern).

@N_Chakote, @Avinash_Parida, which YouTube video talks about running the .scala/.py files? Are you talking about running `sbt package` to create the JAR file and then spark-submit to run the job?
That video doesn't have any info on .sh files. I feel I'm missing out on a big part of the Spark preparation.

check this: