Error: Cannot run program "python3": error=2

I am getting the error "Cannot run program "python3": error=2" even though I have exported PYSPARK_PYTHON. In the dev environment I have Spark 2.4.x with Hadoop 2.7.

[@gw03 ~]$ export PYSPARK_PYTHON=python3

[@gw03 ~]$ spark2-submit --master yarn --deploy-mode client --conf spark.ui.port=16789 src/main/python/pydemo.py prod

SPARK_MAJOR_VERSION is set to 2, using Spark2
Exception in thread "main" java.io.IOException: Cannot run program "python3": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:96)
at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
… 12 more

Please help.

You have given a wrong path. Please check the path once more, and if you still think it might be a lab issue, please share the code you have tried so that we can reproduce the issue and act on it accordingly.
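A quick way to check which interpreter names actually resolve on the gateway node is `command -v` (a minimal sketch; the candidate names here are assumptions, adjust them to your cluster):

```shell
# Probe each candidate interpreter name on the PATH of this node.
# error=2 (ENOENT) from spark2-submit means the name used by
# PYSPARK_PYTHON does not resolve to any executable here.
for py in python3 python3.6 python2.7; do
  if command -v "$py" >/dev/null 2>&1; then
    echo "$py -> $(command -v "$py")"
  else
    echo "$py -> not found on PATH"
  fi
done
```

Whichever name prints a real path is the value PYSPARK_PYTHON should be set to.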

Hi Ramesh,

Can I run the pyspark2 command after exporting PYSPARK_PYTHON=python3? I tried:

$ export PYSPARK_PYTHON=python3
[sujitdas@gw03 ~]$ pyspark2 --master yarn --deploy-mode client --conf spark.ui.port=16878 --num-executors 2 --executor-memory 512M
SPARK_MAJOR_VERSION is set to 2, using Spark2
env: python3: No such file or directory

I got the same error. Please help.

The error=2 (No such file or directory) means no executable named python3 is on the PATH of the lab nodes, so point the variable at the versioned binary that is installed: please use export PYSPARK_PYTHON=python3.6 instead of export PYSPARK_PYTHON=python3.
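As an alternative to exporting environment variables in every session, Spark 2.1+ also lets you pin the interpreter through configuration properties, either in spark-defaults.conf or via --conf on spark2-submit (a sketch, assuming python3.6 is the name installed on your cluster):

```
# spark-defaults.conf (or pass each as --conf key=value on spark2-submit)
spark.pyspark.python         python3.6
spark.pyspark.driver.python  python3.6
```

Note that spark.pyspark.python applies to the executors and spark.pyspark.driver.python to the driver; in client mode the driver runs on the gateway node, so the name must resolve there too.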