PySpark not working on Windows cmd

Hello, I’m using Cygwin and following a YouTube tutorial to set up Spark and use PySpark in the Windows cmd. I set all the required environment variables, including adding Python and Spark to PATH, plus SPARK_HOME and PYSPARK_PYTHON.

However, my Cygwin home directory, “C:\cygwin\home\[firstName lastName]”, has a space in its name. Even after I change into the correct Spark directory, “C:\cygwin\home\[firstName lastName]\spark-1.6.3-bin-hadoop2.6”, running pyspark from bin throws this error:

‘C:\cygwin\home\[firstName]’ is not recognized as an internal or external command, operable program or batch file.
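My rough understanding of why this happens, sketched with Python’s shlex module (which approximates POSIX shell splitting, not cmd.exe’s parser exactly, but shows the same effect; the path and user name below are placeholders, not my real ones):

```python
import shlex

# Placeholder path standing in for my real one; forward slashes are used
# so shlex does not treat the backslashes as escape characters
path = "C:/cygwin/home/John Doe/spark-1.6.3-bin-hadoop2.6/bin/pyspark"

# Unquoted, the tokenizer splits at the space: the command becomes
# "C:/cygwin/home/John", which matches the truncated path in the error
print(shlex.split(path))

# Wrapped in quotes, the whole path stays a single token
print(shlex.split(f'"{path}"'))
```

So the space makes the shell treat everything before it as the command name and the rest as arguments, which is presumably why the error shows only the first half of my home directory.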

I don’t want to change the user name that Cygwin picks as the directory name inside /home. Is there a workaround? Kindly help!

When I try to check whether pyspark works in cmd, I get the error below.

Python Version: 3.8.1
Spark Version: spark-2.4.5-bin-hadoop2.7
winutils.exe is also present, and the environment variables and PATH are set.

Please help me resolve this issue.