java.net.NoRouteToHostException: No route to host


#1

Hello Support team,
I am unable to start spark-shell on gw02.itversity.com and get the error below.

Please help as soon as possible.

The full output, including the stack trace, is below:

[eshwargunturu@gw02 ~]$ spark-shell --master yarn \
  --deploy-mode client \
  --conf spark.ui.port=12335 \
  --num-executors 1 \
  --executor-memory 2048M
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
18/04/23 19:38:35 INFO SecurityManager: Changing view acls to: eshwargunturu
18/04/23 19:38:35 INFO SecurityManager: Changing modify acls to: eshwargunturu
18/04/23 19:38:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(eshwargunturu); users with modify permissions: Set(eshwargunturu)
18/04/23 19:38:35 INFO HttpServer: Starting HTTP Server
18/04/23 19:38:35 INFO Server: jetty-8.y.z-SNAPSHOT
18/04/23 19:38:35 INFO AbstractConnector: Started SocketConnector@0.0.0.0:50506
18/04/23 19:38:35 INFO Utils: Successfully started service 'HTTP class server' on port 50506.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77)
Type in expressions to have them evaluated.
Type :help for more information.
18/04/23 19:38:37 INFO SparkContext: Running Spark version 1.6.2
18/04/23 19:38:37 INFO SecurityManager: Changing view acls to: eshwargunturu
18/04/23 19:38:37 INFO SecurityManager: Changing modify acls to: eshwargunturu
18/04/23 19:38:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(eshwargunturu); users with modify permissions: Set(eshwargunturu)
18/04/23 19:38:38 INFO Utils: Successfully started service 'sparkDriver' on port 45852.
18/04/23 19:38:38 INFO Slf4jLogger: Slf4jLogger started
18/04/23 19:38:38 INFO Remoting: Starting remoting
18/04/23 19:38:38 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.16.1.109:56661]
18/04/23 19:38:38 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 56661.
18/04/23 19:38:38 INFO SparkEnv: Registering MapOutputTracker
18/04/23 19:38:38 INFO SparkEnv: Registering BlockManagerMaster
18/04/23 19:38:38 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-ccd900f2-2828-4241-a64a-4864274527db
18/04/23 19:38:38 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
18/04/23 19:38:38 INFO SparkEnv: Registering OutputCommitCoordinator
18/04/23 19:38:38 INFO Server: jetty-8.y.z-SNAPSHOT
18/04/23 19:38:38 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:12335
18/04/23 19:38:38 INFO Utils: Successfully started service 'SparkUI' on port 12335.
18/04/23 19:38:38 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.16.1.109:12335
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
18/04/23 19:38:39 INFO TimelineClientImpl: Timeline service address: http://rm01.itversity.com:8188/ws/v1/timeline/
18/04/23 19:38:39 INFO RMProxy: Connecting to ResourceManager at rm01.itversity.com/172.16.1.106:8050
18/04/23 19:38:39 INFO AHSProxy: Connecting to Application History server at rm01.itversity.com/172.16.1.106:10200
18/04/23 19:38:40 INFO Client: Requesting a new application from cluster with 7 NodeManagers
18/04/23 19:38:40 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (4096 MB per container)
18/04/23 19:38:40 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
18/04/23 19:38:40 INFO Client: Setting up container launch context for our AM
18/04/23 19:38:40 INFO Client: Setting up the launch environment for our AM container
18/04/23 19:38:40 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://nn01.itversity.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
18/04/23 19:38:40 INFO Client: Preparing resources for our AM container
18/04/23 19:38:40 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://nn01.itversity.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
18/04/23 19:38:40 INFO Client: Source and destination file systems are the same. Not copying hdfs://nn01.itversity.com:8020/hdp/apps/2.5.0.0-1245/spark/spark-hdp-assembly.jar
18/04/23 19:38:40 INFO Client: Uploading resource file:/tmp/spark-b913a26e-5630-4662-9773-319df6b09280/__spark_conf__2390170741704929430.zip -> hdfs://nn01.itversity.com:8020/user/eshwargunturu/.sparkStaging/application_1524500253262_0266/__spark_conf__2390170741704929430.zip
18/04/23 19:38:40 INFO DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:40 INFO DFSClient: Abandoning BP-292116404-172.16.1.101-1479167821718:blk_1082419474_8685585
18/04/23 19:38:40 INFO DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.1.115:50010,DS-cdbb45d6-a19f-4a87-bdac-c5326e93c5ea,DISK]
18/04/23 19:38:40 INFO DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:40 INFO DFSClient: Abandoning BP-292116404-172.16.1.101-1479167821718:blk_1082419475_8685586
18/04/23 19:38:40 INFO DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.1.103:50010,DS-1f4edfab-2926-45f9-a37c-ae9d1f542680,DISK]
18/04/23 19:38:40 INFO DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:40 INFO DFSClient: Abandoning BP-292116404-172.16.1.101-1479167821718:blk_1082419476_8685587
18/04/23 19:38:40 INFO DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.1.102:50010,DS-1edb1d35-81bf-471b-be04-11d973e2a832,DISK]
18/04/23 19:38:40 INFO DFSClient: Exception in createBlockOutputStream
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:40 INFO DFSClient: Abandoning BP-292116404-172.16.1.101-1479167821718:blk_1082419477_8685588
18/04/23 19:38:40 INFO DFSClient: Excluding datanode DatanodeInfoWithStorage[172.16.1.107:50010,DS-a12c4ae3-3f6a-42fc-83ff-7779a9fc0482,DISK]
18/04/23 19:38:40 WARN DFSClient: DataStreamer Exception
java.io.IOException: Unable to create new block.
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1308)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:40 WARN DFSClient: Could not get block locations. Source file "/user/eshwargunturu/.sparkStaging/application_1524500253262_0266/__spark_conf__2390170741704929430.zip" - Aborting...
18/04/23 19:38:40 INFO Client: Deleting staging directory .sparkStaging/application_1524500253262_0266
18/04/23 19:38:40 ERROR SparkContext: Error initializing SparkContext.
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
18/04/23 19:38:41 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
18/04/23 19:38:41 INFO SparkUI: Stopped Spark web UI at http://172.16.1.109:12335
18/04/23 19:38:41 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
18/04/23 19:38:41 INFO YarnClientSchedulerBackend: Stopped
18/04/23 19:38:41 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/04/23 19:38:41 INFO MemoryStore: MemoryStore cleared
18/04/23 19:38:41 INFO BlockManager: BlockManager stopped
18/04/23 19:38:41 INFO BlockManagerMaster: BlockManagerMaster stopped
18/04/23 19:38:41 WARN MetricsSystem: Stopping a MetricsSystem that is not running
18/04/23 19:38:41 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/04/23 19:38:41 INFO SparkContext: Successfully stopped SparkContext
18/04/23 19:38:41 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
18/04/23 19:38:41 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
18/04/23 19:38:41 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
java.net.NoRouteToHostException: No route to host
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1601)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1342)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1295)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:463)

java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>()
at .<init>(<console>:7)
at .<clinit>()
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:16: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala> sc
<console>:20: error: not found: value sc
              sc
              ^


#2

I’m getting the same issue.


#3

I’m also facing the same issue in the lab.


#4

I’m getting the same error. Can someone help ASAP, please?


#5

@eshwar.gunturu
It’s working fine for me. Try this:

spark-shell --master yarn \
  --deploy-mode client \
  --conf spark.ui.port=12335 \
  --num-executors 1 \
  --executor-memory 2048M
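
If the exact same command still fails with "No route to host", the problem is the network path between the gateway and the datanodes rather than the spark-shell options: your log shows the HDFS client being rejected while uploading the staging files to each datanode in turn. As a rough check (assuming nc/netcat is available on gw02), you can probe one of the datanode addresses from your log:

# Probe the data-transfer port (50010) of a datanode listed in the log.
# A "No route to host" here confirms a firewall or routing problem
# outside of Spark, which only the cluster admins can fix.
nc -zv 172.16.1.115 50010

If that port is unreachable, changing Spark settings will not help; it has to be fixed on the cluster side.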


#6

It worked fine after an hour.

