Bigdatacertification 201711: Cygwin not running commands after installing Spark


Hi Durga,

My name is Sravya. I started following your Spark and Python workshop this Wednesday using the student lab access. I set up Cygwin and Python by following your Linux and Python essentials videos, but after setting up Spark, Cygwin stopped working and would not run commands. Cygwin is not even returning results for the ls command. If possible, please look into this issue.



Please try reinstalling Cygwin and let us know. Also paste the steps that you followed.


Hi Vinod,

Thank you very much for your reply.

I was able to run Cygwin normally by removing the Spark environment variables I had added to the .bash_profile file in Cygwin. I am not sure whether I used the right environment variables. I followed the video on Programming Essentials and added the following environment variables to the .bash_profile file in Cygwin:
export SPARK_HOME=/home/Sriram/spark
export PATH=$path:$SPARK_HOME/bin:$SPARK_HOME/sbin

Please let me know whether these Spark environment variables have any mistakes in them.


Hi, the path in Cygwin will be something similar to this: /cygdrive/c/cygwin64/home/username/spark
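For reference, a corrected .bash_profile fragment might look like the sketch below. The exact install directory is an assumption (adjust it to wherever the Spark tarball was actually extracted); the one definite fix is using uppercase $PATH, since the lowercase $path in the original lines is undefined in bash and expands to an empty string, wiping out the existing search path:

```shell
# Hypothetical .bash_profile fragment for Cygwin -- adjust the
# directory to the actual location of the extracted Spark tarball.
export SPARK_HOME=/cygdrive/c/cygwin64/home/Sriram/spark

# Note the uppercase $PATH: the lowercase $path used originally is
# not defined in bash, so the old PATH entries were being lost.
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
```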


Hi Vinod, thank you very much for your reply. I was able to set up Spark, but while running spark-shell I received the following error: "Error: Could not find or load main class org.apache.spark.launcher.Main"
Please find the attached screenshot.



Try running spark-shell.cmd
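That launcher error usually means the spark-shell script cannot find Spark's jar files under SPARK_HOME. A quick sanity check is sketched below; the check_spark_home helper is a hypothetical name, and it assumes a Spark 2.x layout where the launcher jar lives under jars/:

```shell
# Hypothetical helper: does a directory look like a usable Spark home?
check_spark_home() {
  dir="$1"
  # The shell script itself must exist and be executable.
  if [ ! -x "$dir/bin/spark-shell" ]; then
    echo "missing bin/spark-shell"
    return 1
  fi
  # Spark 2.x ships its jars (including spark-launcher) under jars/.
  if ! ls "$dir"/jars/spark-launcher*.jar >/dev/null 2>&1; then
    echo "missing jars/spark-launcher*.jar"
    return 1
  fi
  echo "looks ok"
}
```

If either check fails, the usual fix is to re-extract the downloaded Spark archive and point SPARK_HOME at the resulting directory.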


Hi Vinod,
I tried running spark-shell.cmd too, but even that did not work for me; I am getting the same error as before. (screenshot attached)
If you are available for a quick meeting on TeamViewer, please let me know your availability. I am not sure where I am getting the path wrong, as it says "the system cannot find the specified path".


Hi Guys,

I was also facing the same kind of issues with Cygwin, so for a better experience use Cmder, a console emulator. Everything has been working fine for me with it.


Hi @sravyakalla,

Cygwin is just a simple Unix emulator for Windows; it is not a recommended or supported way to install Spark. It is better to use virtual machines or containers for this. Or you can go with the labs access, where everything is already installed and accessible for you.