NameNode URL and Resource Manager URL not accessible


#1

Hello team,

I’m not able to access the NameNode and Resource Manager UIs. I’m getting “This page cannot be displayed”.

Please look into this.

Regards,
Vijay


#2

The Resource Manager is up:

http://rm01.itversity.com:19288/cluster


#3

@Vijaykv

  • Login to Ambari
  • Click on YARN
  • Click on Quick Links
  • Choose Resource Manager UI


  • Login to Ambari
  • Click on HDFS
  • Click on Quick Links
  • Choose NameNode UI


#4

I tried the steps you suggested. The page still says it is not accessible.

Regards,
Vijay


#5

@Vijaykv

Please provide a screenshot so we can understand the issue better.


#6

Please find the screenshots attached.

Thanks,
Vijay


#7

@Vijaykv

I am able to access the Resource Manager and NameNode using your lab credentials.

Click the links below after logging in to Ambari.

http://rm01.itversity.com:19288/cluster
http://nn01.itversity.com:50070/dfshealth.html#tab-overview
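
If the pages still show “cannot be displayed” in your browser, one quick way to check whether the endpoints are reachable at all (for example from the lab gateway, to rule out a local proxy or firewall issue) is curl. This is only a connectivity sketch using the URLs above:

curl -I http://rm01.itversity.com:19288/cluster
curl -I http://nn01.itversity.com:50070/dfshealth.html

A 200 or 302 response means the services are up and the problem is most likely on the client side (browser proxy, VPN, or corporate firewall).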


#8

Hi, I’m unable to SSH to the itversity labs using Cygwin. Can you please look into it? Below is the error message:

$ ssh kvvijaykv@gw01.itversity.com
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@ WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED! @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
Someone could be eavesdropping on you right now (man-in-the-middle attack)!
It is also possible that a host key has just been changed.
The fingerprint for the ED25519 key sent by the remote host is
SHA256:M40CWUJECN7AVCFz8wjfaBNeMh2mwaAP8+3aUjWSZCU.
Please contact your system administrator.
Add correct host key in /home/vikalpat/.ssh/known_hosts to get rid of this message.
Offending ED25519 key in /home/vikalpat/.ssh/known_hosts:2
ED25519 host key for gw01.itversity.com has changed and you have requested strict checking.
Host key verification failed.
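
For reference, the warning above means the gateway’s host key no longer matches the one cached in known_hosts. If the key change is expected (for example, the gateway was rebuilt), the stale entry flagged at known_hosts:2 can be removed before reconnecting. This is a generic OpenSSH sketch, not a lab-specific instruction:

# remove all cached keys for the gateway, then reconnect and accept the new key
ssh-keygen -R gw01.itversity.com
ssh kvvijaykv@gw01.itversity.com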


#9

@Vijaykv

Please check your gateway and try to log in.

https://labs.itversity.com/user/lab


#10

Hi Chandan,

I’m able to use the gateway, but I need SSH to copy files from my Windows machine to the labs for Spark Streaming.

Regards,
Vijay


#11

@Vijaykv

If you want to copy data from your local machine to the labs’ local file system, you can use an scp transfer.

  • On Windows, you can copy data using WinSCP or FileZilla.
  • On Linux/Mac (or Cygwin on Windows), you can use the scp command in a terminal to copy data from your local machine to the labs’ local file system, for example:

scp file username@hostname:
sseashu1@gw02.itversity.com’s password:
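
As a concrete sketch (the local path and file name below are placeholders; the username and gateway host are the ones used earlier in this thread):

# copy a file from Windows (Cygwin path) to your home directory on the lab gateway
scp /cygdrive/c/data/sample.txt kvvijaykv@gw02.itversity.com:~/

# then, on the gateway, put it into HDFS if the Spark Streaming job reads from HDFS
hdfs dfs -put ~/sample.txt /user/kvvijaykv/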


#12

I’m able to SSH now. Thanks.


#13

Hello team,

While integrating Flume with Spark Streaming, I’m getting the error message below. Can you please look at this?

org.apache.flume.FlumeException: Unable to load sink type: org.apache.spark.streaming.flume.sink.SparkSink, class: org.apache.spark.streaming.flume.sink.SparkSink
at org.apache.flume.sink.DefaultSinkFactory.getClass(DefaultSinkFactory.java:69) ~[flume-ng-core-1.5.2.2.6.5.0-292.jar:1.5.2.2.6.5.0-292]
at org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:41) ~[flume-ng-core-1.5.2.2.6.5.0-292.jar:1.5.2.2.6.5.0-292]
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:415) ~[flume-ng-node-1.5.2.2.6.5.0-292.jar:1.5.2.2.6.5.0-292]
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:103) ~[flume-ng-node-1.5.2.2.6.5.0-292.jar:1.5.2.2.6.5.0-292]
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140) [flume-ng-node-1.5.2.2.6.5.0-292.jar:1.5.2.2.6.5.0-292]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_151]
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_151]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_151]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_151]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_151]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_151]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.flume.sink.SparkSink
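
For context, the sink section of a Flume agent configuration that triggers this lookup typically looks like the sketch below (the agent name, channel name, hostname, and port are placeholders, not taken from this thread). The ClassNotFoundException means Flume cannot find the jar that provides SparkSink on its classpath, not that the sink definition itself is wrong:

# sketch of a Spark sink definition in a Flume agent .conf file
agent.sinks = spark
agent.sinks.spark.type = org.apache.spark.streaming.flume.sink.SparkSink
agent.sinks.spark.hostname = gw02.itversity.com
agent.sinks.spark.port = 8123
agent.sinks.spark.channel = memoryChannel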


#14

The following files are not present in /usr/hdp/2.6.5.0-292/flume/lib:

spark-streaming-flume_2.10-1.6.3.jar
commons-lang3-3.5.jar
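
Until the jars are added to the shared lib directory, one possible workaround (assuming the two jars above can be downloaded into a directory under your home) is to append them to the agent’s classpath when starting Flume; the agent name and conf-file path below are placeholders:

flume-ng agent \
  --name agent \
  --conf-file ~/flume_demo/example.conf \
  --classpath "$HOME/jars/spark-streaming-flume_2.10-1.6.3.jar:$HOME/jars/commons-lang3-3.5.jar" \
  -Dflume.root.logger=INFO,console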


#15

@Vijaykv The jars are available now.


#16