Not able to Launch spark-shell


#1

I am not able to launch spark-shell with either of the two commands below, and I am facing the following issue:

spark-shell

OR

spark-shell --packages com.databricks:spark-avro_2.10:2.0.1

sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/09/17 02:17:38 INFO DiskBlockManager: Shutdown hook called
18/09/17 02:17:38 INFO ShutdownHookManager: Shutdown hook called
18/09/17 02:17:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-0697af25-fc5d-4334-a055-a6d9ad32c29d
18/09/17 02:17:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-d2b2c4b6-2218-4106-aa1a-886a099323b0
18/09/17 02:17:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-d2b2c4b6-2218-4106-aa1a-886a099323b0/userFiles-605752d6-8fc1-43ce-8187-391345b76781


#2

@shubhaprasadsamal Try the command below to launch spark-shell:

spark-shell --packages com.databricks:spark-avro_2.10:2.0.1 --conf spark.ui.port=15847 --conf spark.port.maxRetries=100
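
This points the Spark UI at an unused port and lets Spark retry further ports if that one is also taken, which helps when the default port 4040 is already bound by another running application. If you hit this regularly, the same settings can be made permanent instead of passing `--conf` flags each time. A sketch, assuming a standard Spark install where `conf/spark-defaults.conf` is read at startup (the port value 15847 is just an example; pick any free port):

# conf/spark-defaults.conf
# Start the Spark UI on this port instead of the default 4040
spark.ui.port          15847
# If the port is taken, try up to 100 successive ports before giving up
spark.port.maxRetries  100

With these in place, a plain `spark-shell` (with or without `--packages`) should pick up the settings automatically.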


#3