URGENT: Flume - Error in Web Server logs to HDFS




I’m just getting started with Apache Flume, and I’m getting errors while trying to load data into HDFS:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2018-08-28 01:39:08,861 main ERROR Unable to create file /var/log/flume-ng/flume.log java.io.IOException: Could not create directory /var/log/flume-ng
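The "Could not create directory /var/log/flume-ng" error is a local filesystem permissions problem, not an HDFS one: on a shared lab gateway the lab user typically cannot write under /var/log. A possible workaround (a sketch, assuming Flume's stock log4j.properties resolves its file appender path from the `flume.log.dir` property) is to point Flume's logging at a directory you own:

```shell
# Create a log directory the lab user can write to.
mkdir -p "$HOME/flume-logs"

# Then start the agent with the log directory overridden as a JVM
# property, e.g.:
#   flume-ng agent ... -Dflume.log.dir="$HOME/flume-logs"
# (assumption: the default log4j.properties shipped with Flume reads
# ${flume.log.dir} for its file appender location)
echo "Flume log dir set to $HOME/flume-logs"
```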

I have tried the same steps multiple times, deleting all the directories from local and HDFS, but it gets stuck at the same point.
Also, no directory is created in HDFS when I run the agent with example.conf either.

There are multiple errors in the log, some of which are attached.

Specifically: https://www.youtube.com/watch?v=Bzwwwx-bgM0&index=139&list=PLf0swTFhTI8rT3ApjBqt338MCO0ZvReFt [5:33]

Any help ASAP is appreciated.



Please paste the conf file that you are running as well.


wshdfs.conf: Flume configuration

# Name the components on this agent
wh.sources = ws
wh.sinks = hd
wh.channels = mem

# Describe/configure the source
wh.sources.ws.type = exec
wh.sources.ws.command = tail -F /opt/gen_logs/logs/access.log

# Describe the sink ("XXXXXXXXX" -> my username)
wh.sinks.hd.type = hdfs
wh.sinks.hd.hdfs.path = hdfs://nn01.itversity.com:8020/user/XXXXXXXXX/flume_demo

# Use a channel which buffers events in memory
wh.channels.mem.type = memory
wh.channels.mem.capacity = 1000
wh.channels.mem.transactionCapacity = 100

# Bind the source and sink to the channel
wh.sources.ws.channels = mem
wh.sinks.hd.channel = mem
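For reference, an agent defined like the one above would be started with something along these lines (a sketch: the `--conf` directory is an assumption and varies by install; on HDP clusters it is often under /usr/hdp). The agent name passed with `--name` must match the property prefix used in the conf file, here `wh`:

```shell
# Start the Flume agent named "wh" from wshdfs.conf.
# Assumption: /usr/hdp/current/flume-server/conf is the Flume conf dir
# on this cluster; adjust to your environment.
flume-ng agent \
  --name wh \
  --conf /usr/hdp/current/flume-server/conf \
  --conf-file wshdfs.conf \
  -Dflume.root.logger=INFO,console
```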


Did you replace this with your lab user id?


Yes, I did. And I have tried the whole process shown in the video multiple times as well.



There is a typo in your conf file: while describing the sink, you used ‘/’ instead of ‘.’. I have changed it. Now run the Flume agent and check whether the files are copied to HDFS.
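Once the agent is restarted with the corrected property name, one way to verify that events are landing is to list the sink path in HDFS (using the same placeholder path from the conf file; substitute your actual lab user id for XXXXXXXXX):

```shell
# List rolled event files written by the HDFS sink.
hdfs dfs -ls /user/XXXXXXXXX/flume_demo
```

With an exec source tailing a live log, new FlumeData files should appear here shortly after the agent starts.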


Thanks. This issue is resolved now.