Member since: 11-18-2014
Posts: 196
Kudos Received: 18
Solutions: 8
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 8663 | 03-16-2016 05:54 AM
 | 3997 | 02-05-2016 04:49 AM
 | 2848 | 01-08-2016 06:55 AM
 | 16300 | 09-29-2015 01:31 AM
 | 1728 | 05-06-2015 01:50 AM
09-07-2015
07:53 AM
Hello,

Until now, we used Flume to transfer data once a day from a spool directory to an HDFS sink through a memory channel. Now we want to run the import every 5 minutes, but the Flume channel becomes full on the second import (after 10 minutes).

---
2015-09-07 16:08:04,083 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Last read was never committed - resetting mark position.
2015-09-07 16:08:04,085 WARN org.apache.flume.source.SpoolDirectorySource: The channel is full, and cannot write data now. The source will try again after 4000 milliseconds
---

Flume input: 15-20 files every 5 minutes, each between 10 and 600 KB.

Flume configuration:
- Source: spooldir
- Source maxBlobLength: 1610000000
- Channel capacity: 100000 (we tried values up to 1610000000, but there was no change)
- Channel transaction capacity: 1000
- Sink batch size: 1000
- Sink idle timeout: 60
- Sink roll interval: 3600
- Sink roll size: 63000000
- Sink roll count: 0

What should we change in our configuration? How can we find out whether the blocking part is the channel size or the sink's write speed?

Thank you!
Alina GHERMAN
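For reference, here is roughly how settings like these map onto a flume.conf. This is a minimal sketch only: the names agent1/src1/ch1/sink1 and the paths are placeholders, not the actual file. Swapping the memory channel for a file channel is one commonly suggested way to buffer bursts while the sink drains, though it does not by itself tell you which side is the bottleneck:

```
# Minimal sketch -- agent1/src1/ch1/sink1 and all paths are placeholders.
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /path/to/spool
agent1.sources.src1.channels = ch1

# A file channel persists events to disk, so it can absorb bursts that
# would overflow a memory channel of the same nominal capacity.
agent1.channels.ch1.type = file
agent1.channels.ch1.capacity = 100000
agent1.channels.ch1.transactionCapacity = 1000

agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://namenode/flume/events
agent1.sinks.sink1.hdfs.batchSize = 1000
agent1.sinks.sink1.hdfs.idleTimeout = 60
agent1.sinks.sink1.hdfs.rollInterval = 3600
agent1.sinks.sink1.hdfs.rollSize = 63000000
agent1.sinks.sink1.hdfs.rollCount = 0
```

One way to tell channel size from sink speed apart is to watch the channel fill level over time via Flume's monitoring metrics: a channel that drains back down between imports points at capacity being too small, while one that only ever grows points at the sink's write rate.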
Labels:
- Apache Flume
- HDFS
09-02-2015
04:23 AM
Hello,

The dfsclusterhealth view of the namenodes is giving me the following information:

Total Files And Directories: 64340
Configured Capacity: 1.88 TB
DFS Used: 2.21 TB
Non DFS Used: 120.56 GB
DFS Remaining: 674.13 GB
DFS Used%: 117.57%
DFS Remaining%: 34.96%

How can the DFS used be bigger than the configured capacity? What is the difference?

Thank you,
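As a sanity check on the arithmetic itself: DFS Used% is simply DFS Used divided by Configured Capacity, and 2.21 TB / 1.88 TB ≈ 117.6%, which matches the reported 117.57% up to rounding of the displayed sizes; likewise 674.13 GB / 1.88 TB ≈ 35%, matching DFS Remaining%. So the percentages are internally consistent with the raw figures, and the real question is why the DFS Used counter itself exceeds the configured capacity.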
07-01-2015
12:03 AM
Hello, I'm searching for the recommended configuration for the Flume HDFS sink when HDFS is running in HA. Each time we restart the cluster or a namenode fails, the active namenode changes and Flume fails, since it keeps asking the standby namenode for information. Thank you! Alina
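For reference, one common pattern (a sketch only; agent1/sink1 are placeholders, and nameservice1 stands for whatever dfs.nameservices is set to in hdfs-site.xml) is to point the sink at the logical HA nameservice rather than a specific namenode host, and to make sure the cluster's HA-aware hdfs-site.xml and core-site.xml are on Flume's classpath so the HDFS client can resolve the active namenode by itself:

```
# Sketch only -- agent1/sink1 are placeholders; use your own dfs.nameservices value.
agent1.sinks.sink1.type = hdfs
# Logical nameservice URI, not hdfs://active-namenode-host:8020/...
agent1.sinks.sink1.hdfs.path = hdfs://nameservice1/flume/events
```

With a host-specific URI, the sink keeps talking to whichever namenode was active when the agent started, which matches the failure described above.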
Labels:
- Apache Flume
- HDFS
05-06-2015
01:50 AM
Note: the job is run with MapReduce 2 only if I add a new library to the UDF, i.e. I get this error only when I add an external library to the UDF. The job ran with MR1 (and a lot of warnings) when I added the library with REGISTER in the Pig script. Alina
05-04-2015
07:33 AM
Hello, I'm trying to write an EvalFunc UDF for Pig and I'm getting the following error:

java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:225)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:465)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:368)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1477)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1474)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1407)

After searching, I found that the mandatory requirements for CDH may not be fulfilled: http://pig.apache.org/docs/r0.12.0/start.html http://stackoverflow.com/questions/21873050/pig-found-interface-org-apache-hadoop-mapreduce-jobcontext-but-class-was-expe In fact, now I can't see how this was working until now... How is it possible to have Pig 0.12 with Hadoop 2.5? Thank you!!
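As the linked StackOverflow thread suggests, this particular IncompatibleClassChangeError usually means some jar on the job's classpath was compiled against Hadoop 1, where org.apache.hadoop.mapreduce.JobContext was a class, while the cluster runs Hadoop 2, where it is an interface; a frequent culprit is a Pig jar built for Hadoop 1/MR1 bundled into the UDF jar. For context, the UDF code itself needs nothing Hadoop-version-specific. A minimal EvalFunc sketch (hypothetical class name UpperUdf), compiled against the cluster's own Pig and Hadoop jars rather than shipping them inside the UDF jar:

```
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Hypothetical example UDF: upper-cases its first argument.
// Compile against the cluster's pig/hadoop jars; do not bundle a
// Pig jar built for Hadoop 1 / MR1 into the UDF jar.
public class UpperUdf extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        return ((String) input.get(0)).toUpperCase();
    }
}
```

In the Pig script it would then be used via REGISTER upper-udf.jar; followed by UpperUdf(field), with any third-party libraries the UDF depends on registered the same way rather than repackaged into the UDF jar.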
04-15-2015
12:27 AM
Hello, I'm having the same problem. I tried: sudo -u hdfs hadoop fs -ls -R /tmp but this didn't work... The log files are written in the /tmp folder, but Hue is not finding them.
03-27-2015
01:19 AM
I'm using CDH 5.3.0 ...
03-26-2015
09:20 AM
When I click on the Project column, nothing happens; there is no pop-up... I attached an image to show what I tried... Note: I have CDH 5.3, so Hue 3.7. Also, I'm using Hue in Chrome version 41.0.2272.101 m. Thank you!
03-26-2015
07:32 AM
Hello, We can create projects in our documents by going to the home icon (My Documents) -> My Projects -> +. But how can we add an existing script, or a new one, to a project other than the default? Thank you!
03-24-2015
01:33 AM
1 Kudo
I have Cloudera 5.3.0, so Hue 3.7. How can I upgrade to Hue 3.8? Can I, and should I, upgrade to Hue 3.8? Thank you!