07-17-2018 01:17 PM
I want to load streaming data through a spool directory onto HDFS. I am using the following configuration file:

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /root/spool
a1.sources.r1.fileHeader = true
a1.sources.r1.interceptors = timestampInterceptor
a1.sources.r1.interceptors.timestampInterceptor.type = timestamp

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/root/flume/example
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.writeFormat = Text
a1.sinks.k1.hdfs.rollInterval = 30

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Flume agent command:

cd /usr/hdp/current/flume-server/bin
./flume-ng agent --conf conf --conf-file ~/flumelogs.conf --name a1 -Dflume.root.logger=INFO,console

I recently upgraded Hortonworks Data Platform to version 2.6. These commands worked fine for me with version 2.5, but with the new version I get the following error:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'log4j2.debug' to show Log4j2 internal initialization logging.

Could you please suggest a workaround for this issue?
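For reference, a Flume agent file normally also declares the names of the agent's components before configuring them; those lines are missing from the snippet above (they may simply have been trimmed from the post). A minimal sketch of the declarations, assuming the same agent name a1 and component names r1, k1, and c1 used above:

# Name the source, sink, and channel on agent a1
a1.sources = r1
a1.sinks = k1
a1.channels = c1

Without these declarations the agent starts with no components, so they are worth checking first.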
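Separately, one possible direction for the StatusLogger error (a sketch, not confirmed for this HDP setup): if something on the agent's classpath is initializing Log4j2, the message can usually be silenced by pointing the JVM at an explicit Log4j2 configuration via the standard log4j.configurationFile system property. The file path /root/log4j2.properties below is hypothetical:

# /root/log4j2.properties - minimal Log4j2 console configuration (hypothetical path)
status = error
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{ISO8601} %-5p [%t] (%c) %m%n
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT

It would then be passed on the same command line, e.g.:

./flume-ng agent --conf conf --conf-file ~/flumelogs.conf --name a1 -Dlog4j.configurationFile=/root/log4j2.properties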
Labels:
- Apache Flume