Member since
08-20-2013
43
Posts
1
Kudos Received
3
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 6721 | 05-21-2014 08:40 AM
 | 3135 | 10-07-2013 08:59 AM
 | 6288 | 10-07-2013 08:41 AM
05-21-2014
08:40 AM
Then go to Cloudera Manager and, in the configuration for the Flume agent, under Advanced, you should see "Java Configuration Options for Flume Agent". Add the jars you need there, like: --classpath /.../.../twitter4j.jar. The jar needs to be uploaded to the server running the Flume agent.
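A quick way to confirm the jar actually made it onto the agent's classpath is to check the agent's startup log, where the classpath is dumped. This is a hedged sketch: the log path is hypothetical (adjust to your agent's real log file), and when no log is readable it falls back to a simulated log line purely for illustration.

```shell
# Sketch: check whether twitter4j is on the Flume agent's classpath.
# FLUME_LOG is a hypothetical default path; point it at your real agent log.
LOG="${FLUME_LOG:-/var/log/flume-ng/flume.log}"

# For illustration only: simulate a startup log line if no real log exists.
if [ ! -r "$LOG" ]; then
  LOG=$(mktemp)
  echo "java.class.path=/usr/lib/flume-ng/lib/flume-ng-core.jar:/opt/jars/twitter4j-core.jar" > "$LOG"
fi

# Look for the twitter4j jar anywhere in the dumped classpath.
if grep -q 'twitter4j' "$LOG"; then
  RESULT=found
else
  RESULT=missing
fi
echo "twitter4j: $RESULT"
```

If the result is "missing", the `--classpath` option above was not picked up; restart the agent after changing the configuration.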
05-20-2014
10:02 AM
This error says your twitter4j.jar is not on the classpath of your Flume agent. Check your Flume agent log; the classpath should be dumped into the log when you start the agent. Check whether twitter4j.jar is in there. If not, you can simply add it. Are you using Cloudera Manager?
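For a quick spot check of the error itself, the exception in the agent log usually names the missing class. This is a minimal sketch assuming a hypothetical log location; it searches the log for both the ClassNotFoundException and the jar name.

```shell
# Hypothetical agent log path; substitute your actual Flume log file.
LOG="${FLUME_LOG:-/var/log/flume-ng/flume.log}"

# Simulated log content for illustration when no real log is present.
if [ ! -r "$LOG" ]; then
  LOG=$(mktemp)
  printf '%s\n' \
    "ERROR ... java.lang.ClassNotFoundException: twitter4j.StatusListener" \
    > "$LOG"
fi

# Report any class-not-found errors so you know which jar to add.
MISSING=$(grep -c 'ClassNotFoundException' "$LOG")
echo "class-not-found errors in log: $MISSING"
```

Each reported class tells you which jar still has to be added to the agent's classpath.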
02-28-2014
06:44 AM
I have no experience connecting Hadoop to z/OS DB2, but I don't see why it wouldn't work. DB2 has a JDBC driver, right? So the configuration would be the same for DB2 as for other databases. This IBM page has the details on what parameter values you need to connect to DB2: http://pic.dhe.ibm.com/infocenter/bigins/v2r0/index.jsp?topic=%2Fcom.ibm.swg.im.infosphere.biginsights.import.doc%2Fdoc%2Fdata_warehouse_sqoop.html
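As a hedged sketch of what such a Sqoop import might look like: every host, port, database, schema, table, and credential value below is a placeholder, and the DB2 JDBC driver jar (db2jcc) would need to be placed in Sqoop's lib directory first.

```shell
# Sketch only: a Sqoop import from DB2 over JDBC. All connection
# values are placeholders; port 446 is a common DB2 for z/OS default.
SQOOP_CMD="sqoop import \
 --driver com.ibm.db2.jcc.DB2Driver \
 --connect jdbc:db2://db2host.example.com:446/SAMPLEDB \
 --username dbuser \
 --password-file /user/dbuser/.db2.password \
 --table SCHEMA1.ORDERS \
 --target-dir /user/dbuser/orders"
echo "$SQOOP_CMD"
```

The IBM page linked above is the place to confirm the exact driver class and URL format for your DB2 version.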
02-05-2014
02:34 PM
I am answering my own question here about the ClassNotFoundException. Since I had uploaded the jar into the workflow's workspace, when picking the jar for the Jar and Files fields, the generated workflow.xml showed the jar's path as relative to the workspace. That does not seem to work. I uploaded the same jar to a different HDFS location outside the workspace, regenerated the workflow.xml, and saw that the jar path was fully qualified. This time it worked. I'd have thought the relative path should work as well, but anyhow, this is for others who run into the same issue.
02-05-2014
02:10 PM
Class not found for the driver class. I uploaded the jar into the workflow's workspace folder. The jar is defined in the "Jar" field and also added to Files and Archives. The driver class is definitely in the jar. What am I doing wrong?
02-04-2014
01:41 PM
I'd like to use Hue to configure an Oozie workflow consisting of a MapReduce job. I'm having a hard time figuring out where the arguments go. For example, suppose I want to run the famous wordcount jar, with the twist that it uses a date variable which I will define in a coordinator: $ bin/hadoop jar /usr/joe/wordcount.jar org.myorg.WordCount /usr/joe/wordcount/input/${date} /usr/joe/wordcount/output/${date} From Hue-Oozie it is only obvious to me where the jar file is defined, but what about: - classname - input - output How do I specify these pieces? I wish there were an Oozie workflow video showing how to define a MapReduce action.
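For reference, here is a hedged sketch of what the generated workflow.xml might look like for a map-reduce action: in an Oozie map-reduce action the mapper/reducer classes and input/output dirs go into configuration properties (these are the old mapred API property names), which in Hue are entered as the action's Properties. The inner class names `WordCount$Map` and `WordCount$Reduce` are assumptions about how the wordcount jar is structured; `${date}` would be passed in from the coordinator.

```xml
<!-- Sketch only: an Oozie map-reduce action for the wordcount jar.
     Class names are assumptions; ${date} comes from the coordinator. -->
<action name="wordcount">
  <map-reduce>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
      <property>
        <name>mapred.mapper.class</name>
        <value>org.myorg.WordCount$Map</value>
      </property>
      <property>
        <name>mapred.reducer.class</name>
        <value>org.myorg.WordCount$Reduce</value>
      </property>
      <property>
        <name>mapred.input.dir</name>
        <value>/usr/joe/wordcount/input/${date}</value>
      </property>
      <property>
        <name>mapred.output.dir</name>
        <value>/usr/joe/wordcount/output/${date}</value>
      </property>
    </configuration>
  </map-reduce>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

Note that a map-reduce action does not run the jar's `main()`; if the driver class does essential job setup, a java action (which does run `main()` with arguments) may be the easier route.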
02-04-2014
10:44 AM
Thanks for all your replies. I think I will go with a fresh new MySQL database; the Oozie history isn't that important to me. Just one more question: the workflows and coordinators configured via Hue, that info is stored in the Hue database, right?
02-04-2014
07:54 AM
I knew this day could come and now it is here: Oozie is running very slowly, and I believe it is due to using the embedded Derby DB. Does anyone have experience migrating the data that is already in Derby to an external MySQL? What do I lose if I just start with a clean MySQL and resubmit all the Oozie jobs?
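For context, switching Oozie to an external MySQL comes down to pointing Oozie's JPAService at the new database. This is a hedged sketch of the relevant oozie-site.xml properties; the host, database name, and credentials are placeholders, and the MySQL JDBC driver jar must be on Oozie's classpath.

```xml
<!-- Sketch only: oozie-site.xml entries for an external MySQL.
     Host, database, and credentials below are placeholders. -->
<property>
  <name>oozie.service.JPAService.jdbc.driver</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>oozie.service.JPAService.jdbc.url</name>
  <value>jdbc:mysql://dbhost.example.com:3306/oozie</value>
</property>
<property>
  <name>oozie.service.JPAService.jdbc.username</name>
  <value>oozie</value>
</property>
<property>
  <name>oozie.service.JPAService.jdbc.password</name>
  <value>changeme</value>
</property>
```

With a clean database, Oozie's schema tool (`ooziedb.sh create -run`) initializes the tables; the trade-off, as the post says, is losing the job history from Derby.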
Labels: Apache Oozie
10-07-2013
11:03 AM
Thanks, Romain. I'm looking forward to 3.0.
10-07-2013
08:59 AM
Forgot to mention that updating the timezone does work. So the workaround is to leave it as the default, then go back in to edit the dataset and set the timezone to what you want. The update does stick.