Unable to integrate PyCharm and Spark on Windows 10

Hi Team,

I am trying to integrate PyCharm with Spark as per the instructions in lecture 15 in Section 1 of CCA 175 - Spark and Hadoop Developer - Python (pyspark).

Please find the error screenshot below:

Error: "ImportError: No module named pyspark"

Could you please help me with this issue?


The SparkConf and SparkContext imports must be in camel case. Change the imports to:

from pyspark import SparkConf, SparkContext
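If the imports are already spelled correctly and you still get "ImportError: No module named pyspark", the pyspark package usually just isn't on sys.path. A minimal sketch of adding Spark's bundled Python sources to the path before importing (the add_pyspark_to_path helper and the example C:\spark path are assumptions for illustration, not from the course):

```python
import glob
import os
import sys

def add_pyspark_to_path(spark_home):
    """Prepend Spark's bundled Python sources to sys.path so that
    `from pyspark import SparkConf, SparkContext` can resolve.
    spark_home is your Spark install directory, e.g. C:\\spark on Windows."""
    python_dir = os.path.join(spark_home, "python")
    # The py4j zip file name varies by Spark version, so match it with a glob.
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*-src.zip"))
    paths = [python_dir] + py4j_zips
    # Insert in reverse so python_dir ends up first on sys.path.
    for p in reversed(paths):
        if p not in sys.path:
            sys.path.insert(0, p)
    return paths

# Example usage with a placeholder path (replace with your SPARK_HOME):
added = add_pyspark_to_path(os.path.join("C:" + os.sep, "spark"))
```

After this runs, the camel-case import above should succeed, provided Spark itself is installed at the given location.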

Follow the steps from the blog below to develop a pyspark program using PyCharm on Windows 10:
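In the meantime, the core of what such guides set up is two environment variables. A sketch in POSIX shell syntax (the paths and py4j version are examples, not from the course; on Windows you would set the same two variables in PyCharm under Run > Edit Configurations... > Environment variables):

```shell
# Example Spark install location -- e.g. C:\spark on Windows.
export SPARK_HOME=/opt/spark

# The py4j zip version differs per Spark release; check
# $SPARK_HOME/python/lib for the exact file name on your machine.
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.9-src.zip:$PYTHONPATH"

echo "$PYTHONPATH"
```

With these set in the run configuration, the PyCharm interpreter can find the pyspark module without any sys.path manipulation in the script itself.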