Python 3.6 - PySpark installation

Hi All,

I am getting the error below when installing PySpark on Python 3.6.
Can you please help?

[kushaluit@gw02 ~]$ python3.6 -m pip install pyspark
Processing ./.cache/pip/wheels/ab/09/4d/0d184230058e654eb1b04467dbc1292f00eaa186544604b471/pyspark-2.4.4-py2.py3-none-any.whl
Collecting py4j==0.10.7
Using cached
Installing collected packages: py4j, pyspark
ERROR: Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: ‘/usr/lib/python3.6/site-packages/py4j-0.10.7.dist-info’
Consider using the --user option or check the permissions.
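As the error message itself suggests, the simplest fix that needs no root access is a per-user install. A minimal sketch (assuming `python3.6` is on your PATH):

```shell
# Install into the per-user site-packages (~/.local) instead of
# the root-owned /usr/lib/python3.6/site-packages.
python3.6 -m pip install --user pyspark

# User-installed console scripts land in ~/.local/bin,
# so add that to PATH if it is not there already.
export PATH="$HOME/.local/bin:$PATH"
```

After this, `pyspark` should be runnable from the shell without touching any root-owned directory.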


The permission bits for "other" users allow only read and execute (no write), and the owner is root:
[bielolopez@gw02 ~]$ ls -altr /usr/lib | grep python
drwxr-xr-x 3 root root 4096 Jun 5 2017 python2.6
drwxr-xr-x 3 root root 4096 Aug 8 2017 python3.5
drwxr-xr-x 3 root root 4096 Aug 5 2018 python3.6
drwxr-xr-x 3 root root 4096 Aug 6 20:51 python2.7
drwxr-xr-x 3 root root 4096 Nov 1 06:01 python3.4

Given how the system is set up, you cannot modify these permissions yourself; you would have to ask for root access.

You can run Python, but you cannot install packages system-wide.
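If `--user` installs are also restricted, a virtual environment is another root-free option. A sketch, assuming this `python3.6` build ships with the `venv` module:

```shell
# Create an isolated environment in your home directory.
python3.6 -m venv ~/pyspark-env

# Activate it; pip now installs into ~/pyspark-env, not /usr/lib.
source ~/pyspark-env/bin/activate
pip install pyspark
```

Everything installed this way stays inside `~/pyspark-env`, so the shared system directories are never touched.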

One way to try it (though it still fails without root) would be:

[bielolopez@gw02 ~]$ export PYSPARK_PYTHON=/usr/lib/python3.4
[bielolopez@gw02 ~]$ pyspark --master yarn --conf spark.ui.port=12890 --num-executors 2 --executor-memory 51
SPARK_MAJOR_VERSION is set to 2, using Spark2
env: /usr/lib/python3.4: Permission denied
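One likely cause of that `env: ... Permission denied` message, independent of root access: `PYSPARK_PYTHON` is pointing at a library *directory*, which cannot be executed. It should point at the interpreter binary itself (the exact path below is an assumption; verify it with `which python3.6`):

```shell
# Point PYSPARK_PYTHON at the executable, not the lib directory.
# /usr/bin/python3.6 is an assumed path -- check with `which python3.6`.
export PYSPARK_PYTHON=/usr/bin/python3.6
```

With that set, `pyspark --master yarn ...` can launch using the chosen Python without needing write access to `/usr/lib`.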

But, as shown, root permissions are needed here.
If you are on a server where you have been given such permissions, you can do this by exporting `PYSPARK_PYTHON` to point at the Python 3.x you want to use.

I am sending you this link, but I would do it in a virtual machine, so as not to modify the working environment of the access I have.