Cannot access Spark shell via the spark-shell command


#1

Hi teaching team,
I suddenly cannot access the Spark shell via the spark-shell command.
Below is the error message I got:

================================================================================
19/02/01 15:27:28 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@50672905: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2040)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2031)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:137)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:481)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:15)
at $line3.$read$$iwC.(:24)
at $line3.$read.(:26)
at $line3.$read$.(:30)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
19/02/01 15:27:28 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
19/02/01 15:27:28 WARN Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.

================================================================================
Could you help me solve this issue?
Thank you.
chun-yen


#2

@1114 Launch spark-shell with an explicit UI port to avoid this sort of issue, like below:

spark-shell --conf spark.ui.port=<FIVE_DIGIT_PORT>
spark-shell --conf spark.ui.port=22562
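For context, the underlying cause is generic and not Spark-specific: only one server can listen on a given TCP address/port at a time, so a second bind attempt fails with "Address already in use". A minimal Python sketch (standard-library socket module only, no Spark involved) reproduces the same failure:

```python
import socket

# First server binds and listens on an OS-assigned free port.
s1 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s1.bind(("127.0.0.1", 0))          # port 0 -> OS picks a free port
port = s1.getsockname()[1]
s1.listen(1)

# Second server tries the same port -> OSError (EADDRINUSE),
# the same condition Spark surfaces as java.net.BindException.
s2 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s2.bind(("127.0.0.1", port))
    conflict = False
except OSError:
    conflict = True
finally:
    s2.close()
    s1.close()

print(conflict)
```

Spark copes with this by retrying successive ports (that is the WARN line in your log about port 4046 failing and 4047 being attempted), so pinning spark.ui.port to a port you know is free simply skips that scan.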