Stopping the Spark Context and Relaunching It


I am taking the HDPCD-Spark course on Udemy. In Section 4, Lecture 35, towards the end, there is an explanation of how to stop the Spark context and relaunch it by importing SparkConf and SparkContext and creating new objects. However, it does not show how to override parameters like num-executors, executor-memory, etc.
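To frame the question, here is a minimal sketch of the relaunch pattern as I understand it from the lecture, with the kind of overrides I am asking about set on the `SparkConf` (the specific values are just examples, and I am not sure these property names are the right way to mirror the command-line flags):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Stop the context that spark-shell created automatically
sc.stop()

// Build a new configuration; these setters are my guess at how
// --num-executors and --executor-memory would be overridden here
val conf = new SparkConf()
  .setAppName("RelaunchedShell")
  .set("spark.executor.instances", "4")   // equivalent of --num-executors?
  .set("spark.executor.memory", "2g")     // equivalent of --executor-memory?

// Create a fresh context with the new settings
val newSc = new SparkContext(conf)
```

Is this the intended approach, or do some of these settings only take effect when passed at launch time?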

Also, how is this different from using :q to quit the shell (stopping the Spark context) and relaunching it with the spark-shell command and the required properties?
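For comparison, this is the alternative I mean, passing the properties directly on the command line when relaunching (values are only illustrative):

```shell
# Quit the running shell with :q, then relaunch with the desired resources
spark-shell \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2
```

Is there any practical difference between the two approaches, or are they interchangeable?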