Member since 08-02-2015 | 3 Posts | 0 Kudos Received | 0 Solutions
10-28-2015 12:05 AM
Yes, I looked there, but I didn't find any files with the .COMPLETED extension in that folder.
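For reference, the spooling directory source renames each fully ingested file by appending a completion suffix (".COMPLETED" by default, configurable via the source's fileSuffix property). A quick way to check is to simulate and list that rename; the directory below is a temporary stand-in, not the real spool path:

```shell
# Simulate the rename the spooling directory source performs once a file
# has been fully ingested (default suffix: .COMPLETED), then list matches.
SPOOL=$(mktemp -d)
touch "$SPOOL/events.log"
mv "$SPOOL/events.log" "$SPOOL/events.log.COMPLETED"   # what Flume does on completion
ls "$SPOOL" | grep '\.COMPLETED$'                      # → events.log.COMPLETED
rm -rf "$SPOOL"
```

If no *.COMPLETED files ever appear in the real spool directory, the source has not ingested anything, which points at the agent not running or the spoolDir path being wrong.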
10-27-2015 03:35 AM
Hi team, I have set up a spooling directory source with the properties below, but I didn't get any output from this configuration. Can anyone give me an idea of how to set this up in Cloudera Manager? Here, Agent1 runs on one machine and Agent2 on another. I started Agent2 first; after that, Agent1 should pick up the data placed in the spool directory.

Machine2 configuration file:
--------------------------------------
### Agent2 - Avro Source, File Channel, HDFS Sink

# Name the components on this agent
Agent2.sources = avro-source
Agent2.channels = file-channel
Agent2.sinks = hdfs-sink

# Describe/configure the source
Agent2.sources.avro-source.type = avro
Agent2.sources.avro-source.hostname = 192.168.1.206
Agent2.sources.avro-source.port = 7182

# Describe the sink
Agent2.sinks.hdfs-sink.type = hdfs
Agent2.sinks.hdfs-sink.hdfs.path = hdfs://192.168.1.201:8020/user/flume/
Agent2.sinks.hdfs-sink.hdfs.rollInterval = 0
Agent2.sinks.hdfs-sink.hdfs.rollSize = 0
Agent2.sinks.hdfs-sink.hdfs.rollCount = 10000
Agent2.sinks.hdfs-sink.hdfs.fileType = DataStream

# Use a channel that buffers events on disk
Agent2.channels.file-channel.type = file
Agent2.channels.file-channel.checkpointDir = /home/hduser/Desktop/testflume/checkpoint
Agent2.channels.file-channel.dataDirs = /home/hduser/Desktop/testflume/data/

# Bind the source and sink to the channel
Agent2.sources.avro-source.channels = file-channel
Agent2.sinks.hdfs-sink.channel = file-channel

Machine1 configuration file:
------------------------------------
### Agent1 - Spooling Directory Source, File Channel, Avro Sink

# Name the components on this agent
Agent1.sources = spooldir-source
Agent1.channels = file-channel
Agent1.sinks = avro-sink

# Describe/configure the source
Agent1.sources.spooldir-source.type = spooldir
Agent1.sources.spooldir-source.spoolDir = /home/hduser/Desktop/testflume/spooldir

# Describe the sink
# IP address masked here
Agent1.sinks.avro-sink.type = avro
Agent1.sinks.avro-sink.hostname = 192.168.1.206
Agent1.sinks.avro-sink.port = 7182

# Use a channel that buffers events on disk
Agent1.channels.file-channel.type = file
Agent1.channels.file-channel.checkpointDir = /home/hduser/Desktop/testflume/checkpoint
Agent1.channels.file-channel.dataDirs = /home/hduser/Desktop/testflume/data/

# Bind the source and sink to the channel
Agent1.sources.spooldir-source.channels = file-channel
Agent1.sinks.avro-sink.channel = file-channel
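As a rough sketch of how the two agents would be started from the command line outside Cloudera Manager (the file names agent1.conf/agent2.conf and the --conf directory are assumptions; under Cloudera Manager the agent name and configuration are instead set in the Flume service's configuration page), the startup order described above would be:

```shell
# On Machine2: start the collector first, so the Avro sink on Machine1
# has a listener to connect to.
flume-ng agent --conf /etc/flume-ng/conf \
    --conf-file agent2.conf --name Agent2 -Dflume.root.logger=INFO,console

# On Machine1: then start the spooling-directory agent.
flume-ng agent --conf /etc/flume-ng/conf \
    --conf-file agent1.conf --name Agent1 -Dflume.root.logger=INFO,console
```

Note that --name must exactly match the property prefix (Agent2 / Agent1); a mismatch makes the agent start with no components and produce no output, which matches the symptom described.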
08-02-2015 10:36 PM
Actually, I have successfully imported data from Oracle 11g to HDFS from the terminal, but from the Hue web UI the Oracle-to-HDFS job fails with "Error: Could not start job." I filled in these fields:

Schema name:
Table name: Share_osc
Table SQL statement:
Table column names:
Partition column name:
Null value allowed for the partition column:
Boundary query:
Extractors:
Loaders:
Override null value:
Null value:
Output format: Text File
Compression format: none
Custom compression format:
Output directory: /tmp/test1

Can anyone give me a solution for this?
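For comparison, the terminal import that worked would look roughly like the Sqoop command below. The connect string, host, service name, and username are placeholders (not from the post); only the table name and output directory are taken from the fields above. The Hue app drives a Sqoop2 job, so its form fields (Extractors, Loaders, etc.) map to Sqoop2 job settings rather than to these command-line flags:

```shell
# Sketch of the command-line Oracle import (placeholders: oracle-host,
# ORCL service name, SCOTT user); table and target dir from the post.
sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
  --username SCOTT -P \
  --table SHARE_OSC \
  --target-dir /tmp/test1 \
  --as-textfile
```

Since the terminal import succeeds, a "Could not start job" from Hue usually points at the Sqoop2 server side (connection definition, JDBC driver availability, or the Sqoop2 server logs) rather than at Oracle itself.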