Unable to run program in PySpark

#1

Hi @itversity

I am unable to run a PySpark program; it fails with the following errors:

spark-submit --master local saveFile.py
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
16/12/12 23:04:57 INFO SparkContext: Running Spark version 1.6.2
16/12/12 23:04:58 INFO SecurityManager: Changing view acls to: rajsharmaplus
16/12/12 23:04:58 INFO SecurityManager: Changing modify acls to: rajsharmaplus
16/12/12 23:04:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rajsharmaplus); users with modify permissions: Set(rajsharmaplus)
16/12/12 23:04:58 INFO Utils: Successfully started service 'sparkDriver' on port 51428.
16/12/12 23:04:58 INFO Slf4jLogger: Slf4jLogger started
16/12/12 23:04:58 INFO Remoting: Starting remoting
16/12/12 23:04:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.16.1.100:46009]
16/12/12 23:04:58 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 46009.
16/12/12 23:04:58 INFO SparkEnv: Registering MapOutputTracker
16/12/12 23:04:58 INFO SparkEnv: Registering BlockManagerMaster
16/12/12 23:04:58 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3f85e190-bd91-42b3-b6e9-4bb8065caa60
16/12/12 23:04:58 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
16/12/12 23:04:59 INFO SparkEnv: Registering OutputCommitCoordinator
16/12/12 23:04:59 INFO Server: jetty-8.y.z-SNAPSHOT
16/12/12 23:04:59 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2024)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2015)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:137)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
16/12/12 23:04:59 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@6532eb2a: java.net.BindException: Address already in use
java.net.BindException: Address already in use

Can you please guide me on how to overcome this?

Regards, Raj


#2

@raj_sharma Have you tried this answer?


#3

@raj_sharma You can try submitting the job by running the following command:

spark-submit --master local --conf "spark.ui.port=10111" saveFile.py
# In place of 10111 you can use any other free, valid port.
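If you prefer to keep the spark-submit command unchanged, the same setting can be applied inside the script through SparkConf. Below is a minimal sketch of what saveFile.py could look like; the app name, output path, and save logic are assumptions, since the original script was not posted here:

from pyspark import SparkConf, SparkContext

# Move the Spark UI off the default port 4040, which is already taken,
# and let Spark retry the next ports if the chosen one is busy as well.
conf = (SparkConf()
        .setAppName("saveFile")
        .set("spark.ui.port", "10111")
        .set("spark.port.maxRetries", "32"))

sc = SparkContext(conf=conf)

# Placeholder save logic -- replace with the real contents of saveFile.py.
sc.parallelize(["sample record"]).saveAsTextFile("/tmp/saveFile_output")

sc.stop()

The java.net.BindException in your log just means another application (most likely another running Spark shell or job on the same host) is already listening on the default UI port 4040, so pointing the UI at a free port removes the conflict.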

#4

Perfect… 🙂

Thanks.


closed #5