I’m facing the same error and have spent quite some time debugging it. My Java environment variables are set, and I have also added the Spark environment variables to both .bash_profile and .profile.
I’m running Cygwin as administrator but am still unable to execute chmod +x spark-shell.cmd.
Here are the errors I’m getting:
Error: Could not find or load main class org.apache.spark.launcher.Main
-bash: /home/mc58838/spark/bin/spark-shell.cmd: Permission denied
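For what it’s worth, a minimal scratch-file check like the following (not Spark-specific; the /tmp path is just an example) can show whether chmod takes effect at all on a given Cygwin mount — mounts configured with the noacl option silently ignore POSIX permission changes, which would explain a persistent "Permission denied" even after chmod +x:

```shell
# Sanity check: toggle the execute bit on a scratch file to verify
# that chmod actually works on this filesystem. On Cygwin, a mount
# with 'noacl' makes chmod a no-op.
tmp=$(mktemp /tmp/chmodtest.XXXXXX)
echo 'echo ok' > "$tmp"

chmod -x "$tmp"
test ! -x "$tmp" && echo "execute bit cleared"

chmod +x "$tmp"
test -x "$tmp" && echo "execute bit set"

rm -f "$tmp"
```

If the second message doesn’t appear, the problem is the mount’s permission handling rather than Spark itself.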