Initializing spark shell issue


#1

Team,
I have been running the command below, but it is taking longer than usual and the job is failing.
Could you please look into this issue?

spark-shell --master yarn-client \
  --conf spark.ui.port=22322 \
  --num-executors 1 \
  --executor-memory 512M

Thanks


#2


#3

@Z1806027 Try the command below and check again. Let us know the status.

spark-shell --conf spark.ui.port=22344 --conf spark.port.maxRetries=100 --master yarn-client --num-executors 1 --executor-memory 512M


#4

Nope. It is still taking a long time and throwing the same error.


#5

Please let me know if there is any update, so that I can proceed further.
Thanks for your time and help.


#6

@Z1806027:

I just executed the commands below and everything works fine. Check my execution time below:

######################
spark-shell --master yarn-client \
  --conf spark.ui.port=22322 \
  --num-executors 1 \
  --executor-memory 512M
######################

17/12/26 11:15:47 INFO SparkILoop: Created sql context (with Hive support)…
SQL context available as sqlContext.

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@602b7944

scala> sqlContext
res1: org.apache.spark.sql.SQLContext = org.apache.spark.sql.hive.HiveContext@25ea3a5f

Thanks
Venkat


#7

Thanks, the command executed well. But when I checked the Spark UI, it shows the wrong number of executors.
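Besides the UI, you can cross-check the executor count from inside the same spark-shell session. This is a minimal sketch assuming Spark 1.x (as in the output above), where `sc.getExecutorMemoryStatus` returns one entry per block manager, including the driver:

```scala
// Run inside spark-shell (Spark 1.x).
// getExecutorMemoryStatus includes the driver's block manager,
// so subtract 1 to get the number of registered executors.
val executorCount = sc.getExecutorMemoryStatus.size - 1
println(s"Registered executors: $executorCount")
```

Note that if dynamic allocation is enabled on the cluster, executors register lazily, so the count can differ from `--num-executors` until some work has actually run.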


#8

Do you have any update for me?


#9

@Z1806027:
It's interesting. Can you/someone redirect this question to Durga?
Thanks


#10

@dgadiraju @viswanath.raju (not sure which is your ID, so I am tagging both)
Could you please look into this issue?
I have been looking for an answer.