Lab is down for Spark Environment

The lab environment seems to be down.
***Stuck for the last 30 minutes while starting the Spark shell:***

[sbanerjee@gw03 ~]$ spark2-shell --master yarn --conf spark.ui.port=12654
SPARK_MAJOR_VERSION is set to 2, using Spark2
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
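(A quick sanity check from the gateway, assuming the standard Hadoop/YARN CLI is on the PATH there, is to ask the ResourceManager for its NodeManager list; if no nodes are RUNNING, the shell will never get its containers:)

yarn node -list -all    # shows each NodeManager and its state (RUNNING, LOST, UNHEALTHY, ...)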

Yes, I have the issue too. The Spark session seems to be stuck in the ACCEPTED state for a long time.
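For reference, this can be confirmed from the YARN side (assuming the yarn client is available on the gateway); an application sitting in ACCEPTED means the ResourceManager has queued it but has not yet allocated resources for it:

yarn application -list -appStates ACCEPTED    # applications queued but not yet given containers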

@Naresh_Raj The issue is fixed. Please try now.