I'm trying to configure my environment for Spark and Python on Ubuntu 14.04 Desktop. I already have Spark 1.6.2 running with Python 3.5. My next step is to create Python scripts that use a SparkContext so I can play with RDDs, etc.
My problem is basically that I have spent a long time trying to set up pyspark through a broad set of commands on Ubuntu.
- First, I tried to install pip for Python 3.5 with:
curl -O https://bootstrap.pypa.io/get-pip.py
python3.5 get-pip.py
However, I always get errors because my VM cannot reach the server or the package. In addition, I tried to install pip with: sudo apt-get install pip -y and apparently it worked, but now when I run:
pip install pyspark, I realize this package is only for Spark 2.0 and later, so I really don't know how to set up this final part.
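In case it helps, here is what I believe the manual setup for Spark 1.6 would look like, pointing PYTHONPATH at the pyspark that ships inside the Spark download instead of installing it from pip (the install path and the py4j zip file name are assumptions based on my own download; they vary by release):

```shell
# Assumed location of the unpacked Spark 1.6.2 distribution (adjust to yours)
export SPARK_HOME=/opt/spark-1.6.2-bin-hadoop2.6

# Make the bundled pyspark and py4j importable from plain Python scripts
# (check the actual py4j zip name under $SPARK_HOME/python/lib)
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"

# Tell Spark which Python interpreter to launch for the driver and executors
export PYSPARK_PYTHON=python3.5
```

Is this the right direction, or is there a cleaner way to do it on 14.04?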
I appreciate your support as always, team. Thanks.