Member since: 02-08-2016
Posts: 36
Kudos Received: 18
Solutions: 4
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1488 | 12-14-2017 03:09 PM |
 | 2445 | 08-03-2016 02:49 PM |
 | 4552 | 07-26-2016 10:52 AM |
 | 4398 | 03-07-2016 12:47 PM |
12-14-2017
03:09 PM
Thank you @Matt Andruff for your reply. I resolved the issue. I had another .jar in the /lib directory containing the same code but with a different file name. I'm not sure how it affects the execution of the job, but after removing it everything works fine, for now at least.
08-03-2016
02:49 PM
Okay, I found a workaround: I added -Duser.timezone=GMT, which changes the JVM timezone. The final flume-ng command is as follows:

flume-ng agent --conf-file spool1.properties --name agent1 --conf $FLUME_HOME/conf -Duser.timezone=GMT

The directory needed by the Oozie coordinator is now being created.
07-27-2016
09:17 AM
Thank you @Michael M and @Alexander Bij for your valuable help.
07-20-2016
11:06 AM
Okay, I installed the NodeManager on the 3 remaining nodes, and now all the nodes are active.
05-18-2016
03:23 PM
1 Kudo
Have you read the documentation on HBaseStorage on the Apache Pig website? http://pig.apache.org/docs/r0.15.0/func.html#HBaseStorage You should be able to just not specify the column family in the columns definition (e.g. a colon with no text before it). If this doesn't work, the parsing logic probably needs to be updated.
08-20-2017
07:51 AM
Hey, did you solve the issue? I am getting the same error; please help me with this:

hadoop jar newtable-0.0.1-SNAPSHOT.jar newtable.createtable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at newtable.createtable.main(createtable.java:17)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
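A NoClassDefFoundError for org.apache.hadoop.hbase.HBaseConfiguration usually points at a classpath problem rather than a code problem: the HBase client jars are not visible when the jar is launched with `hadoop jar`. As a rough illustration only, here is a minimal sketch of the kind of table-creation program the command above appears to run, written against the HBase 1.x Java client API; the package/class, table, and column-family names are assumptions, since the original source is not shown:

```java
package newtable;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

// Minimal sketch of an HBase table-creation job (HBase 1.x client API).
// Table name "newtable" and column family "cf" are placeholders, not taken from the post.
public class createtable {
    public static void main(String[] args) throws Exception {
        // This is typically the line that fails with NoClassDefFoundError when the
        // hbase-client/hbase-common jars are missing from the runtime classpath.
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Admin admin = connection.getAdmin()) {
            HTableDescriptor table = new HTableDescriptor(TableName.valueOf("newtable"));
            table.addFamily(new HColumnDescriptor("cf"));
            if (!admin.tableExists(table.getTableName())) {
                admin.createTable(table);
            }
        }
    }
}
```

A common way to make the HBase classes visible at run time is to put the output of `hbase classpath` on HADOOP_CLASSPATH before invoking `hadoop jar`, or to build a fat jar that bundles the HBase client dependencies.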
04-08-2016
01:17 PM
It depends on what you plan to do:
- Aggregation queries and analytical reports: Hive (a simple JDBC connection is supported by BIRT and Pentaho, and you can also build servlets with JDBC pools, the whole shebang).
- Selecting one record at a time (like a dashboard that shows the data of one customer): HBase with the REST API from JavaScript might work, or HBase with the Java API from a servlet. If you prefer SQL, Apache Phoenix is a cool SQL layer on top of HBase: https://phoenix.apache.org/ (see the JDBC sketch below).
- Interactive reports on thousands to millions of records (not billions): Apache Phoenix. It provides some good enhancements over plain HBase from a performance perspective for anything that touches more than a row, and you can also do joins, aggregations, etc.
(If you want HBase but have Kerberos set up, have a look at Knox: it's an SSL-capable proxy that strips away the Kerberos requirement and replaces it with a normal web authentication setting for the HBase API.)
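To make the Phoenix option concrete, here is a minimal sketch of running an aggregation query against HBase through the Phoenix JDBC driver. The ZooKeeper quorum string and the SALES table with its columns are hypothetical placeholders, not something from this thread, and the phoenix-client jar is assumed to be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: an aggregation query over an HBase table via Phoenix JDBC.
public class PhoenixQueryExample {
    public static void main(String[] args) throws Exception {
        // The Phoenix JDBC URL points at the ZooKeeper quorum;
        // "zk1,zk2,zk3" is a placeholder for the actual quorum hosts.
        String url = "jdbc:phoenix:zk1,zk2,zk3";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // SALES is a hypothetical Phoenix table used only for illustration.
             ResultSet rs = stmt.executeQuery(
                     "SELECT CUSTOMER_ID, SUM(AMOUNT) FROM SALES GROUP BY CUSTOMER_ID")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
            }
        }
    }
}
```

Because Phoenix exposes HBase through standard JDBC, it tends to be the most comfortable choice when a reporting tool or servlet already speaks SQL.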
03-07-2016
12:49 PM
@Neeraj Sabharwal Even in 0.11 it still exists; see my answer below.
03-04-2016
10:42 AM
That solved the problem, but the Map job got stuck, and even after killing it the YARN container still existed; I had to kill it manually. I'll come back to this shortly.
03-01-2016
01:37 AM
@Zaher Mahdhi I agree. Can you please post this as a new question and provide steps to reproduce the problem?