Unable to connect to pyspark

pyspark

#1

Hi team,

I am getting this error again. It has been a day now, and I see the same error whenever I try to log in to pyspark.

It looks like whenever the internet connection drops, the lab session also gets terminated. Can you please check and revert ASAP? This is wasting my time and blocking my access.

[ganeshlaxman@gw03 ~]$ export SPARK_MAJOR_VERSION=2
[ganeshlaxman@gw03 ~]$ pyspark
SPARK_MAJOR_VERSION is set to 2, using Spark2
Python 2.7.5 (default, Aug 4 2017, 00:39:18)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-16)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
18/08/31 05:32:34 WARN Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
18/08/31 05:32:34 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:352)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:379)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:382)
at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:382)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:382)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:451)
at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:451)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:451)
at org.apache.spark.api.java.JavaSparkContext.&lt;init&gt;(JavaSparkContext.scala:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
ERROR:root:Exception while sending command.
Traceback (most recent call last):
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 908, in send_command
response = connection.send_command(command)
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1067, in send_command
"Error while receiving", e, proto.ERROR_ON_RECEIVE)
Py4JNetworkError: Error while receiving
ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (127.0.0.1:39989)
Traceback (most recent call last):
File "/usr/hdp/current/spark2-client/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 990, in start
self.socket.connect((self.address, self.port))
File "/usr/lib64/python2.7/socket.py", line 224, in meth
return getattr(self._sock,name)(*args)


#2

@Ganesh_Devadiga, launch pyspark with the command below:

pyspark --master yarn --conf spark.ui.port=12888
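Some background on why this works: by default Spark tries ports 4040 through 4055 for the SparkUI (16 attempts, controlled by spark.port.maxRetries), and on a shared gateway node every one of those ports can already be taken by other users' Spark sessions, which is exactly when the "failed after 16 retries" error appears. Setting spark.ui.port=12888 moves the UI to a port outside that crowded range; raising spark.port.maxRetries is the other option the error message suggests. A rough sketch of Spark's retry behaviour, using a hypothetical helper find_free_port (not part of Spark's API, just an illustration):

```python
import socket

def find_free_port(start_port, max_retries=16):
    """Mimic Spark's startServiceOnPort: try start_port, start_port+1, ...
    and return the first port that binds, or raise after max_retries."""
    for offset in range(max_retries):
        port = start_port + offset
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", port))
            s.close()
            return port  # this port is currently free
        except socket.error:
            s.close()  # port busy, try the next one
    raise socket.error(
        "failed after %d retries (starting from %d)" % (max_retries, start_port)
    )
```

With many concurrent users, all 16 ports from 4040 upward can be busy at once, so the loop exhausts its retries just as in the stack trace above; a port like 12888 is far less likely to collide.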


#3

Thanks, it is working now.

