Flume to Spark Streaming Integration error - Unable to load sink type: org.apache.spark.streaming.flume.sink.SparkSink



Hi, I am going through the Udemy course CCA 175 (Scala).
As per the resource below for Flume and Spark Streaming integration, I prepared the configuration file exactly as shown in the link, in the labs (gw03 by ITVersity).

However, I am getting the Error/Exception below. I checked the jar files in the path as demonstrated in the video, and they are all present. Could you please help me fix this?

19:34:54.800 [conf-file-poller-0] ERROR org.apache.flume.node.PollingPropertiesFileConfigurationProvider - Failed to load configuration data. Exception follows.
org.apache.flume.FlumeException: Unable to load sink type: org.apache.spark.streaming.flume.sink.SparkSink, class: org.apache.spark.streaming.flume.sink.SparkSink
    at org.apache.flume.sink.DefaultSinkFactory.getClass(DefaultSinkFactory.java:69) ~[flume-ng-core-]
    at org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:41) ~[flume-ng-core-]
    at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:415) ~[flume-ng-node-]
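For context, this exception usually means Flume cannot find the SparkSink class on its classpath, not that the .conf file itself is wrong: the spark-streaming-flume-sink jar (and its scala-library and commons-lang3 dependencies) must be visible to the Flume agent, e.g. via Flume's plugins.d directory or the --classpath option of flume-ng. A minimal sketch of the pull-based sink section of the config is below; the agent name, channel name, hostname, and port are assumptions, so adjust them to your own setup:

```
# Hypothetical agent named "sparkagent" with an existing channel "mem_channel"
sparkagent.sinks = sparksink
sparkagent.sinks.sparksink.type = org.apache.spark.streaming.flume.sink.SparkSink
# Host/port the Spark Streaming job will pull from (assumed values)
sparkagent.sinks.sparksink.hostname = gw03.itversity.com
sparkagent.sinks.sparksink.port = 8123
sparkagent.sinks.sparksink.channel = mem_channel
```

Even if the jars are present on disk, the error will still occur when the flume-ng launch command does not actually include them on the classpath, so it is worth double-checking how the agent is started.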


Could someone please help me resolve this issue?



Can you please share the .conf file as well?


Hi Sunil… I have not been able to log in to the lab (gw03) since this morning… it says "resource temporarily unavailable". I will paste the conf file content as soon as the login issue is fixed.