Flume ".conf" file error



As far as I know, I did the same as Durga sir explained in the video regarding the .conf file. Before running spark-submit for Flume, when I trigger the .conf file it pops up the error below.

Kindly, can anyone look into this show-stopper?


    ... 15 more

18/06/08 09:29:53 INFO sink.SparkSink: Starting Spark Sink: spark on port: 8123 and interface: gw03.itversity.com with pool size: 10 and transaction timeout: 60.
18/06/08 09:29:53 ERROR lifecycle.LifecycleSupervisor: Unable to start SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@6959a2dc counterGroup:{ name:null counters:{} } } - Exception follows.
org.jboss.netty.channel.ChannelException: Failed to bind to: gw03.itversity.com/
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:297)
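A Netty `ChannelException: Failed to bind` at sink startup usually means either the port is already in use on that interface, or the hostname does not resolve to an address on the local machine. As a quick sketch (not itversity-specific tooling; the port 8123 and the gateway hostname are taken from the log above), you can check whether the address is bindable from the box the agent runs on:

```python
import socket

def can_bind(host: str, port: int) -> bool:
    """Return True if a TCP socket can bind to host:port right now."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind((host, port))
            return True
    except OSError:
        # Covers "address already in use" and unresolvable/non-local hostnames.
        return False

# Host and port come from the error log; run this on the same
# machine where the Flume agent (and its SparkSink) starts.
print(can_bind("gw03.itversity.com", 8123))
```

If this prints `False`, either another process (possibly a stale Flume agent) holds the port, or the hostname is not local to that machine.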


Can you paste the video link here? It would be helpful for solving your issue.


Hi Sunil, thanks again for the quick response.

Here is the video link of the '.conf' explanation for Flume, with NO audio.

Link for flume ‘.conf’ explanation

Soon after running the Flume .conf, I should see the data feed going to HDFS and Spark, as these are my sinks. Instead, I am ending up with an error; I attached the error.log file separately.


Kindly, can someone troubleshoot the error, help me out, and let me know where I went wrong.
Thank you


Replace the source-to-channel binding with:

ag.sources.src.channels = hdmem sparkmem

but you have given it as:

ag.sources.src.channels = hd spark

Start the Flume agent and it will put the data into both the HDFS and Spark sinks.
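To put that fix in context, here is a minimal sketch of how the agent wiring could look (a hypothetical agent named `ag`, with the channel/sink names from the reply above; the Spark sink hostname and port are assumptions taken from the error log, and HDFS sink details are omitted):

```properties
# Declare the components of agent "ag"
ag.sources = src
ag.channels = hdmem sparkmem
ag.sinks = hd spark

# Fan the single source out to BOTH channels -- this is the line that was wrong
ag.sources.src.channels = hdmem sparkmem

# HDFS sink reads from the hdmem channel
ag.sinks.hd.type = hdfs
ag.sinks.hd.channel = hdmem

# Spark sink reads from the sparkmem channel
ag.sinks.spark.type = org.apache.spark.streaming.flume.sink.SparkSink
ag.sinks.spark.channel = sparkmem
ag.sinks.spark.hostname = gw03.itversity.com
ag.sinks.spark.port = 8123
```

Each sink must be attached to its own channel, and the source's `channels` property must list every channel name exactly as declared, or events never reach one of the sinks.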

Sunil Abhishek