SOLVED [Error 5] Access is denied when importing SparkConf, SparkContext

pyspark
apache-spark

#1

I'm trying to run the code below in PyCharm:

from pyspark import SparkConf, SparkContext

sc = SparkContext(master="local", appName="Spark Demo")
# raw string so the backslash in the Windows path isn't treated as an escape
print(sc.textFile(r"C:\deckofcards.txt").first())

I receive the following error:

C:\Python27\python.exe C:/Users/jenne/PycharmProjects/gettingstarted/sparkdemo.py
Traceback (most recent call last):
  File "C:/Users/jenne/PycharmProjects/gettingstarted/sparkdemo.py", line 3, in <module>
    sc = SparkContext(master="local", appName="Spark Demo")
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark-1.6.3-bin-hadoop2.6\python\pyspark\java_gateway.py", line 79, in launch_gateway
    proc = Popen(command, stdin=PIPE, env=env)
  File "C:\Python27\lib\subprocess.py", line 390, in __init__
    errread, errwrite)
  File "C:\Python27\lib\subprocess.py", line 640, in _execute_child
    startupinfo)
WindowsError: [Error 5] Access is denied

Process finished with exit code 1

Please help


#2

I am also getting a similar error. The post is pretty old, but any help with this would be greatly appreciated!


#3

How was this solved? Could you please let me know?


#4

Hi @Sameer_Rao

The issue could be with the permissions on the C:\ drive. Can you copy the file to the D:\ drive and run the code again?
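A quick way to narrow this down before involving Spark is to check what Python itself can see. This is just a sketch using the path from post #1; adjust it to wherever your file lives:

import os

path = r"C:\deckofcards.txt"
print("exists:", os.path.exists(path))      # does the file exist at this path?
print("readable:", os.access(path, os.R_OK))  # can the current user read it?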

Regards,
Sunil Itversity


#5

Thanks, Sunil!! It turned out to be an issue with the permissions on the Spark binaries.

After changing the permissions, it worked like a charm :slight_smile:
samee@DESKTOP-S1P27Q2 /cygdrive/d/spark/spark-2.2.1-bin-hadoop2.7/bin
$ ls -lartr
total 104
-rwxrw-r--+ 1 samee samee 1155 Nov 25 2017 spark-submit2.cmd
-rwxrw-r--+ 1 samee samee 1035 Nov 25 2017 spark-submit.cmd
-rwxrwxr-x+ 1 samee samee 1040 Nov 25 2017 spark-submit
-rwxrwxr-x+ 1 samee samee 1065 Nov 25 2017 spark-sql
-rwxrw-r--+ 1 samee samee 1631 Nov 25 2017 spark-shell2.cmd
-rwxrw-r--+ 1 samee samee 1033 Nov 25 2017 spark-shell.cmd
-rwxrwxr-x+ 1 samee samee 3017 Nov 25 2017 spark-shell
-rwxrw-r--+ 1 samee samee 1049 Nov 25 2017 sparkR2.cmd
-rwxrw-r--+ 1 samee samee 1023 Nov 25 2017 sparkR.cmd
-rwxrwxr-x+ 1 samee samee 1039 Nov 25 2017 sparkR
-rwxrw-r--+ 1 samee samee 2545 Nov 25 2017 spark-class2.cmd
-rwxrw-r--+ 1 samee samee 1035 Nov 25 2017 spark-class.cmd
-rwxrwxr-x+ 1 samee samee 3196 Nov 25 2017 spark-class
-rwxrw-r--+ 1 samee samee 1076 Nov 25 2017 run-example.cmd
-rwxrwxr-x+ 1 samee samee 1030 Nov 25 2017 run-example
-rwxrw-r--+ 1 samee samee 1540 Nov 25 2017 pyspark2.cmd
-rwxrw-r--+ 1 samee samee 1025 Nov 25 2017 pyspark.cmd
-rwxrwxrwx+ 1 samee samee 2989 Nov 25 2017 pyspark
-rwxrw-r--+ 1 samee samee 2133 Nov 25 2017 load-spark-env.sh
-rwxrw-r--+ 1 samee samee 1968 Nov 25 2017 load-spark-env.cmd
-rwxrw-r--+ 1 samee samee 2681 Nov 25 2017 find-spark-home.cmd
-rwxrwxr-x+ 1 samee samee 1933 Nov 25 2017 find-spark-home
-rwxrw-r--+ 1 samee samee 919 Nov 25 2017 beeline.cmd
-rwxrwxr-x+ 1 samee samee 1089 Nov 25 2017 beeline
drwxrwxr-x+ 1 samee samee 0 Nov 25 2017 ..
drwxrwxr-x+ 1 samee samee 0 Nov 25 2017 .

samee@DESKTOP-S1P27Q2 /cygdrive/d/spark/spark-2.2.1-bin-hadoop2.7/bin
$ chmod 777 *

samee@DESKTOP-S1P27Q2 /cygdrive/d/spark/spark-2.2.1-bin-hadoop2.7/bin
$
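For anyone hitting this without Cygwin, here is a rough Python sketch of the same fix. The Spark path below is an assumption taken from the session above; point it at your own installation. Note that on native Windows, os.chmod can only toggle the read-only flag, so Cygwin's chmod (or adjusting permissions in the file Properties dialog) is the more reliable route:

import os
import stat

# Assumed install location; adjust to wherever your Spark binaries live.
spark_bin = r"D:\spark\spark-2.2.1-bin-hadoop2.7\bin"

for name in os.listdir(spark_bin):
    path = os.path.join(spark_bin, name)
    if os.path.isfile(path):
        # Equivalent of chmod 777: read/write/execute for user, group, other.
        os.chmod(path, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO)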

