ClassNotFoundException in Spark Streaming - please help me fix this error

apache-spark
scala

#1

Hi Team,

I'm currently learning Spark Streaming. I followed the same steps shown in the video for Kafka integration with Spark Streaming, but while executing the program on the cluster I get a ClassNotFoundException.

Please help me correct this error. Below is the command I'm running and the full output (a sketch of my job class and build file is at the end of this post):
spark-submit --master local[2] --deploy-mode client --conf spark.ui.port=12901 --class GetStreamingDepartmentTraffic --packages com.typesafe:config:1.3.2,org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 --verbose streamingdemo.jar dev

Adding default property: spark.driver.memory=1g
Parsed arguments:
master local[2]
deployMode client
executorMemory 1G
executorCores 1
totalExecutorCores null
propertiesFile /home/eshwar/work/spark-2.3.0-bin-hadoop2.7/conf/spark-defaults.conf
driverMemory 1g
driverCores null
driverExtraClassPath null
driverExtraLibraryPath null
driverExtraJavaOptions null
supervise false
queue null
numExecutors null
files null
pyFiles null
archives null
mainClass GetStreamingDepartmentTraffic
primaryResource file:/home/eshwar/datastorm/streamingdemo.jar
name GetStreamingDepartmentTraffic
childArgs [dev]
jars null
packages com.typesafe:config:1.3.2,org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0
packagesExclusions null
repositories null
verbose true

Spark properties used, including those specified through
--conf and those from the properties file /home/eshwar/work/spark-2.3.0-bin-hadoop2.7/conf/spark-defaults.conf:
(spark.driver.memory,1g)
(spark.ui.port,12901)
(spark.master,spark://ubuntu:7077)

Ivy Default Cache set to: /home/eshwar/.ivy2/cache
The jars for the packages stored in: /home/eshwar/.ivy2/jars
:: loading settings :: url = jar:file:/home/eshwar/work/spark-2.3.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.typesafe#config added as a dependency
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.typesafe#config;1.3.2 in spark-list
found org.apache.spark#spark-streaming-kafka-0-10_2.11;2.3.0 in spark-list
found org.apache.kafka#kafka-clients;0.10.0.1 in spark-list
found net.jpountz.lz4#lz4;1.3.0 in spark-list
found org.xerial.snappy#snappy-java;1.1.2.6 in spark-list
found org.slf4j#slf4j-api;1.7.21 in spark-list
found org.spark-project.spark#unused;1.0.0 in spark-list
:: resolution report :: resolve 810ms :: artifacts dl 35ms
:: modules in use:
com.typesafe#config;1.3.2 from spark-list in [default]
net.jpountz.lz4#lz4;1.3.0 from spark-list in [default]
org.apache.kafka#kafka-clients;0.10.0.1 from spark-list in [default]
org.apache.spark#spark-streaming-kafka-0-10_2.11;2.3.0 from spark-list in [default]
org.slf4j#slf4j-api;1.7.21 from spark-list in [default]
org.spark-project.spark#unused;1.0.0 from spark-list in [default]
org.xerial.snappy#snappy-java;1.1.2.6 from spark-list in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 7 | 0 | 0 | 0 || 7 | 0 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 7 already retrieved (0kB/29ms)
2018-11-04 22:04:27 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Main class:
GetStreamingDepartmentTraffic
Arguments:
dev
Spark config:
(spark.ui.port,12901)
java.lang.ClassNotFoundException: GetStreamingDepartmentTraffic
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:235)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:836)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-11-04 22:01:33 INFO ShutdownHookManager:54 - Shutdown hook called
2018-11-04 22:01:33 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-2cf2ca2b-0899-48c3-b00e-7a0f86d820bf
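For context, the job class looks roughly like the sketch below. This is not my exact code: the broker address, topic name, and group id here are placeholders, and the real program picks them up from a Typesafe Config file keyed by the "dev" argument. It uses the same spark-streaming-kafka-0-10 direct stream API that --packages pulls in above:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object GetStreamingDepartmentTraffic {
  def main(args: Array[String]): Unit = {
    // args(0) is the environment label ("dev" in the submit command);
    // the real program uses it to look up properties via Typesafe Config.
    val conf = new SparkConf().setAppName("GetStreamingDepartmentTraffic")
    val ssc = new StreamingContext(conf, Seconds(30))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",             // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "department-traffic",                  // placeholder group id
      "auto.offset.reset" -> "latest"
    )

    val messages = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Set("retail_logs"), kafkaParams) // placeholder topic
    )

    // Stand-in for the real per-department traffic counts: just count each batch.
    messages.map(_.value()).count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}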
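The jar was built with sbt package. Reconstructed from the --packages list above (so the versions shown match what I submit with), my build.sbt is roughly:

name := "streamingdemo"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "com.typesafe" % "config" % "1.3.2",
  "org.apache.spark" %% "spark-core" % "2.3.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.3.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0" % "provided"
)

One thing I still need to rule out myself: if the source file declares a package (say package retail, just as an example), then --class would need the fully qualified name (retail.GetStreamingDepartmentTraffic), and jar tf streamingdemo.jar should show where the class actually sits inside the jar.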