
Trying to run a simple Java HBase program with Eclipse

Contributor

I'm able to run simple 'create' and 'put' commands to add data to rows (I'm working from HBase: The Definitive Guide), but it fails when I try to run the Java program shown below:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PutExample {
   public static void main(String[] args) throws IOException {
      // Picks up hbase-site.xml (and hbase-default.xml) from the classpath
      Configuration conf = HBaseConfiguration.create();

      // The table is expected to exist already (e.g. created from the HBase shell)
      HTable table = new HTable(conf, "jptesttable");

      // Write two cells into row "row1", column family "colfam1"
      Put put = new Put(Bytes.toBytes("row1"));
      put.add(Bytes.toBytes("colfam1"), Bytes.toBytes("qual1"), Bytes.toBytes("val1"));
      put.add(Bytes.toBytes("colfam1"), Bytes.toBytes("qual2"), Bytes.toBytes("val2"));

      table.put(put);
      table.close();
   }
}

 I'm pretty sure I have all the libraries I need (plus some I don't need):

    hbase.jar
    commons-logging-1.1.1.jar
    log4j-1.2.17.jar
    zookeeper.jar
    commons-lang-2.5.jar
    commons-configuration-1.6.jar
    slf4j-api-1.6.1.jar
    slf4j-log4j12-1.6.1.jar
    hadoop.core.jar
    core.3.1.1.jar
    hadoop.common-2.0.0-cdh4.1.1.jar
    hbase-0.92.1-cdh4.1.1-security.jar

and

    /usr/lib/hbase/conf on the classpath

I haven't changed the hbase-site.xml file:

   

<configuration>
  <!-- Changing the default port for REST since it conflicts with yarn nodemanager  -->
  <property>
    <name>hbase.rest.port</name>
    <value>8070</value>
    <description>The port for the HBase REST server.</description>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:8020/hbase</value>
  </property>
</configuration>

Is there anything I can do to get this to work?

 

Here's the error log:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/Maps
at org.apache.hadoop.metrics2.lib.MetricsRegistry.<init>(MetricsRegistry.java:42)
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:87)
at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:133)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:97)
at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:190)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
at org.apache.hadoop.hbase.security.User.call(User.java:586)
at org.apache.hadoop.hbase.security.User.callStatic(User.java:576)
at org.apache.hadoop.hbase.security.User.access$400(User.java:50)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:393)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:388)
at org.apache.hadoop.hbase.security.User.getCurrent(User.java:139)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionKey.<init>(HConnectionManager.java:412)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:182)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:196)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:173)
at PutExample.main(PutExample.java:17)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Maps
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 23 more

 

thanks,

jp


7 REPLIES

Guru

Try placing the following on the command line before your "hadoop jar...." command:

 

HADOOP_CLASSPATH=`hbase classpath`

 

Which brings up a good point: you ARE running your program with the syntax "hadoop jar /path/to/your/jar className...", right?
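
Putting those together, the invocation would look something like this (the jar path and class name below are just placeholders):

HADOOP_CLASSPATH=`hbase classpath` hadoop jar /path/to/your/jar className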

Contributor
I'm trying to run it through the IDE (Eclipse) - just running the program in Debug mode.
In my search for answers, I'm only seeing questions from people running from the command line.
Can I test from the Eclipse IDE, or will my testing have to be from jar files at the command line?

Guru

Oh OK, thanks for clarifying. Yes, you can most certainly develop and test apps in Eclipse. In fact, there is a blog post and video tutorial on that exact process posted in this thread. I bet that'll get you the help you need, but please let us know if not.

 

Regards.

Contributor

Thanks.  I used the excellent video and blog (and my Cloudera Academic Training notes) to run Hadoop MR scenarios as jar files and through Eclipse.  Very happy.

I haven't been successful in running the HBase Java program I listed earlier, or the MakeTable example included in the HBase directory.

Configuration issues?  (I'm using the out-of-the-box configuration.) 

 

[training@localhost java]$ javac -classpath `hbase classpath` MakeTable.java
[training@localhost java]$ jar cvf MakeTable.jar MakeTable.class
[training@localhost java]$ hadoop jar MakeTable.jar MakeTable

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

at MakeTable.main(MakeTable.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 6 more

 

jp

Contributor (Accepted Solution)

I was able to run it with the following commands:

[training@localhost java]$ javac -cp `hbase classpath` MakeTable.java
[training@localhost java]$ java -cp `hbase classpath` MakeTable
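
The same approach should work for the PutExample program above, assuming PutExample.java is in the current directory (the ":." adds that directory to the runtime classpath so the compiled class is found):

[training@localhost java]$ javac -cp `hbase classpath` PutExample.java
[training@localhost java]$ java -cp `hbase classpath`:. PutExample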

 Thanks for your help.

jp

Guru

Thanks for the follow-up. I was going to suggest that exact thing for the command line, but the part I wasn't clear on was how to get Eclipse to load the HBase jars both at compile time AND at runtime. I'm not super familiar with IDE-based development, but that is the key to getting HBase applications to run.

 

Glad you resolved it!

Contributor
For anybody who might be following this: I added all the libraries referenced in the video, and for HBase I added the two HBase jars (hbase.jar and hbase-0.92.1-cdh4.1.1-security.jar) to the reference list.
Then it worked in Eclipse.
-- jp