Unable to launch pyspark


#1

Hi,

I am unable to launch pyspark and am getting the error below. However, I am still able to access HDFS. Could you please take a look and get back to me as soon as possible?

Below is the error message.

SPARK_MAJOR_VERSION is set to 2, using Spark2
Python 2.7.5 (default, Aug 4 2017, 00:39:18)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-16)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
18/08/21 00:39:07 WARN Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
18/08/21 00:39:07 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:352)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:379)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:382)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:382)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:382)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:451)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:451)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:451)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
ERROR:root:Exception while sending command.
Traceback (most recent call last):
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 908, in send_command
response = connection.send_command(command)
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1067, in send_command
"Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (127.0.0.1:46409)
Traceback (most recent call last):
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 990, in start
self.socket.connect((self.address, self.port))
File "/usr/lib64/python2.7/socket.py", line 224, in meth
return getattr(self._sock,name)(*args)
error: [Errno 111] Connection refused
Traceback (most recent call last):
File "/usr/hdp/current/spark2-client/python/pyspark/shell.py", line 51, in <module>
if conf.get('spark.sql.catalogImplementation', '').lower() == 'hive':
File "/usr/hdp/current/spark2-client/python/pyspark/conf.py", line 187, in get
return self._jconf.get(key, defaultValue)
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1158, in __call__
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 906, in send_command
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 854, in _get_connection
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 860, in _create_connection
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 997, in start
py4j.protocol.Py4JNetworkError: An error occurred while trying to connect to the Java server (127.0.0.1:46409)
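The BindException above is the root cause: Spark probes spark.ui.port (4040 by default) and then the next spark.port.maxRetries ports (16 by default), and every port from 4040 through 4056 was already taken, most likely by other Spark shells left running on the same gateway node (an assumption; any process can be holding those ports). A minimal sketch of that probing loop, handy for checking from Python whether any UI port in the range is still free (the function name is mine, not Spark's):

```python
import socket

def find_free_ui_port(start=4040, max_retries=16):
    """Probe ports roughly the way Spark's startServiceOnPort does:
    try start, start+1, ..., start+max_retries and return the first
    port we can actually bind on."""
    for offset in range(max_retries + 1):
        port = start + offset
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            return port  # bind succeeded, so this port is free right now
        except OSError:
            continue     # port in use, mirror Spark's "Attempting port N+1"
        finally:
            s.close()    # release the probe socket either way
    raise OSError("no free port in range %d-%d" % (start, start + max_retries))

if __name__ == "__main__":
    print(find_free_ui_port())
```

If this raises even with no Spark shells of your own running, `netstat -tlnp | grep 40[4-5]` (or asking colleagues on the shared edge node) should show who owns the ports.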


#2

@Ganesh_Devadiga Can you share the command you're trying?


#3

I ran the command below before launching pyspark.

export SPARK_MAJOR_VERSION=2
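If the bind error from post #1 comes back, the error message itself names two workarounds: pin the UI to a port you know is free, or let Spark probe more than the default 16 candidates. On the pyspark command line that could look like the sketch below (4099 and 50 are example values, assumed suitable for your gateway node):

```shell
export SPARK_MAJOR_VERSION=2

# Option 1: pin the SparkUI to an explicit free port
pyspark --conf spark.ui.port=4099

# Option 2: keep the default start port but probe more candidates
pyspark --conf spark.port.maxRetries=50
```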


#4

It was working two days ago, but for the last two days it has not been working.


#5

Can you paste the command you are using to launch pyspark?