Unable to connect to spark-shell in labs

spark-shell

#1

Hi team,

When connecting to spark-shell, I just tried:

spark-shell --master yarn --conf spark.ui.port=12599 --num-executors 4 --executor-cores 2 --executor-memory 2G

Please help resolve this…

It just keeps displaying:
18/02/11 10:42:26 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:27 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:28 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:29 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:30 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:31 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:32 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:33 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:34 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:35 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
18/02/11 10:42:36 INFO Client: Application report for application_1517228278761_7441 (state: ACCEPTED)
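
When an application sits in ACCEPTED like this, YARN has not yet found capacity to start the ApplicationMaster. A quick way to confirm that the queue is the bottleneck (assuming you have the YARN CLI on the gateway node, and that the lab submits to the "default" queue; both are assumptions here) is:

# list applications still waiting for resources
yarn application -list -appStates ACCEPTED

# check how full the queue is
yarn queue -status default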


#2

Me too! Same thing here:


18/02/11 10:46:09 INFO Client:
         client token: N/A
         diagnostics: [Sun Feb 11 10:46:06 -0500 2018] Application is Activated, waiting for resources to be assigned for AM.  Details : AM Partition = <DEFAULT_PARTITION> ; Partition Resource = <memory:172032, vCores:42> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 100.0 % ; Queue's Absolute max capacity = 100.0 % ;
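
Those diagnostics show the queue's absolute used capacity at 100%, so YARN cannot allocate a container for the AM until other jobs release resources. As a rough workaround while the cluster is this busy (same flags as the original command, just with smaller, purely illustrative values), you could request a smaller footprint so the application fits into whatever frees up first:

spark-shell --master yarn --conf spark.ui.port=12599 --num-executors 1 --executor-cores 1 --executor-memory 1G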


#3

@Rakz_Roamerz Hmm, this does not look great: http://gw01.itversity.com:8080/#/main/services/YARN/summary


#4

ZooKeeper Server Stopped (1 alert)

http://gw01.itversity.com:8080/#/main/alerts/11

It has been down for 2 hours…
I think I’ll switch to my VM for a bit :stuck_out_tongue:
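
If you can SSH to the node running ZooKeeper, you can double-check the alert yourself. This sketch assumes a standard installation listening on the default client port 2181 and that nc is installed:

# four-letter-word health check; a healthy server replies "imok"
echo ruok | nc localhost 2181

# or use the server script, if it is on the PATH
zkServer.sh status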


#5

@Rakz_Roamerz It’s working now with:

spark-shell --master yarn --conf spark.ui.port=   

#6

Thanks for checking, @supermario.


#7

, 47029)
18/02/11 12:32:27 INFO BlockManagerMaster: Removed 1 successfully in removeExecutor
18/02/11 12:32:27 ERROR TransportClient: Failed to send RPC 4954156124599270757 to wn01.itversity.com/172.16.1.102:59470: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
18/02/11 12:32:27 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to get executor loss reason for executor id 1 at RPC address wn01.itversity.com:59479, but got no response. Marking as slave lost.
java.io.IOException: Failed to send RPC 4954156124599270757 to wn01.itversity.com/172.16.1.102:59470: java.nio.channels.ClosedChannelException
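
Errors like ClosedChannelException and "slave lost" usually mean the executor JVM was killed (for example by YARN for exceeding its memory allocation) or its node became unreachable, so the driver's RPC can no longer be delivered; given the earlier posts, an unhealthy cluster is the likely cause here. If it recurs on a healthy cluster, one common mitigation (values below are only illustrative, not a lab recommendation) is to give executors extra off-heap headroom and a longer network timeout:

spark-shell --master yarn --executor-memory 2G \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --conf spark.network.timeout=300s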


#8

@supermario Everything is fixed. Let us know if you are still facing any issues.