Getting an import error for mysql.connector while launching pyspark2 on a Linux terminal

Hi,
Please help me resolve this error. When I try to launch pyspark2 from a Linux terminal, it throws the error below:

Command used in the Linux terminal:

    $ /bin/pyspark2 --master yarn

    Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
    Spark2 will be picked by default
    Fatal Python error: Py_Initialize: can't initialize sys standard streams
    Traceback (most recent call last):
      File "/usr/lib64/python3.6/io.py", line 52, in <module>
      File "/home/userno/abc.py", line 3, in <module>
    ModuleNotFoundError: No module named 'mysql'
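
From the traceback, it looks like Python's own io module is importing my abc.py in place of the standard library's abc module while the interpreter starts up, presumably because /home/userno is on my PYTHONPATH. A minimal file that should fail the same way (just a sketch: lines 1 and 2 are placeholders, but line 3 is the mysql.connector import from the title):

    import os                  # line 1 (placeholder)
    import sys                 # line 2 (placeholder)
    import mysql.connector     # line 3: raises ModuleNotFoundError: No module named 'mysql'
    # the file name abc.py shadows the stdlib 'abc' module, which Python's
    # io module imports during startup -- hence the fatal Py_Initialize error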

What can I do if I want to use pyspark in shell scripting?
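
For context, this is roughly the kind of wrapper script I am aiming for (a sketch; my_job.py and its path are placeholders):

    #!/bin/bash
    # pin Spark 2 explicitly so the "multiple versions of Spark" warning goes away
    export SPARK_MAJOR_VERSION=2
    # run the PySpark code non-interactively with spark-submit instead of the pyspark2 shell
    spark-submit --master yarn /home/userno/my_job.py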

