Support Questions


Getting HConnection as null, Failed to create local dir /data0/hadoop/hbase/local/jars, DynamicClassLoader failed to init

Contributor

I am trying to connect to an HBase server running on a different host from a Linux client and I get the error below. The same code works fine from my Windows laptop: I can connect to the HBase server and get results, which tells me my code logic is correct. I think I may be missing a dependency jar on the Linux server, because adding the hbase-client jar is what made it work from my laptop. The configuration is being picked up correctly, as I have verified from my laptop: I am passing hbase-site.xml, core-site.xml, and hdfs-site.xml in my resources, my port and ZooKeeper quorum are correct, and my Kerberos code works fine. Could this be a permission issue? I don't understand why this happens or when it happens. Please provide some suggestions.

Any help or suggestion is much appreciated

Code: connection is returned as null 😞

this.conf = HBaseConfiguration.create();
this.conf.set("hbase.zookeeper.quorum", zookeeperQuorum);
this.conf.set("hbase.zookeeper.property.clientPort", port);
this.conf.set("zookeeper.znode.parent", "/hbase-secure");
// this.conf.set("hbase.client.retries.number", Integer.toString(35));
// this.conf.set("zookeeper.session.timeout", Integer.toString(20000));
// this.conf.set("zookeeper.recovery.retry", Integer.toString(1));
this.conf.set("hadoop.security.authentication", "kerberos");
this.conf.set("hbase.security.authentication", "kerberos");
this.conf.set("hbase.master.kerberos.principal", userName);
this.conf.set("user.name", userName);

try {
    this.connection = HConnectionManager.createConnection(conf);
} catch (IOException e) {
    e.printStackTrace();
}

pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.msoa.hbase.client</groupId>
<artifactId>simpleHBase</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>HbaseWrite</name>
<url>http://maven.apache.org</url>

<build>
  <plugins>
    <plugin>
      <artifactId>maven-assembly-plugin</artifactId>
      <configuration>
        <archive>
          <manifest>
            <mainClass>simpleHBase.actionClass</mainClass>
          </manifest>
        </archive>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
    </plugin>
  </plugins>
</build>

<!-- added for dev box -->
<repositories>
  <repository>
    <id>repo.hortonworks.com</id>
    <name>Hortonworks HDP Maven Repository</name>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>
<!-- end dev box -->

<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>

<dependency>
  <groupId>jdk.tools</groupId>
  <artifactId>jdk.tools</artifactId>
  <scope>system</scope>
  <version>1.7.0_60</version>
  <systemPath>C:\Program Files\Java\jdk1.7.0_60\lib\tools.jar</systemPath>
</dependency>

<!--   adding to test on beam -->
<dependency><groupId>org.apache.hadoop</groupId><artifactId>hadoop-common</artifactId><version>2.2.0</version></dependency>

<dependency><groupId>org.apache.hadoop</groupId><artifactId>hadoop-hdfs</artifactId><version>2.2.0</version></dependency>

<dependency><groupId>org.apache.hadoop</groupId><artifactId>hadoop-client</artifactId><version>2.2.0</version></dependency>

<!--  add protocol for beam test-->
<dependency><groupId>org.apache.hbase</groupId><artifactId>hbase-protocol</artifactId><version>0.98.0-hadoop2</version></dependency>

<dependency><groupId>org.apache.hbase</groupId><artifactId>hbase-client</artifactId><version>0.98.0-hadoop2</version></dependency>

<dependency><groupId>org.apache.hbase</groupId><artifactId>hbase-common</artifactId><version>0.98.0-hadoop2</version></dependency>

<dependency><groupId>org.apache.hbase</groupId><artifactId>hbase-server</artifactId><version>0.98.0-hadoop2</version></dependency>

<dependency><groupId>org.springframework</groupId><artifactId>spring-core</artifactId><version>4.2.3.RELEASE</version></dependency>

<dependency><groupId>org.springframework</groupId><artifactId>spring-context</artifactId><version>4.2.3.RELEASE</version></dependency>

<dependency><groupId>org.springframework</groupId><artifactId>spring-beans</artifactId><version>4.2.3.RELEASE</version></dependency>

</dependencies>
</project>



Error:

java.io.IOException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:309)
        at simpleHBase.HBaseConnectionFactory.<init>(HBaseConnectionFactory.java:99)
        at simpleHBase.HBaseClient.<init>(HBaseClient.java:26)
        at simpleHBase.actionClass.main(actionClass.java:118)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
        ... 4 more
Caused by: java.lang.ExceptionInInitializerError
        at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
        at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
        at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
        ... 9 more
Caused by: java.lang.RuntimeException: Failed to create local dir /data0/hadoop/hbase/local/jars, DynamicClassLoader failed to init
        at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:94)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:201)
        ... 14 more



3 REPLIES

Super Guru

It looks like you have the value of the configuration property "hbase.local.dir" set in hbase-site.xml to "/data0/hadoop/hbase/local". The code checks to see if this directory exists, and, if it does not, creates it. If you are on a Linux system, it is likely that your client does not have permission to write to the root of the filesystem ("/").

The default value for "hbase.local.dir" is "/tmp/hbase-local-dir". You could consider setting a value for this property to a directory within "/tmp" as it should be writable by any user; however, any writable directory by the user running your code should be sufficient.
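If editing hbase-site.xml on the client is not an option, a minimal sketch of the programmatic alternative (the class name and the /tmp/hbase-local-dir path below are illustrative, assuming the same HBase 0.98 client API used in the question) is to override the property on the client Configuration before creating the connection:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.client.HConnectionManager;

public class LocalDirOverrideSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        // Point hbase.local.dir at a directory the client user can write to;
        // /tmp/hbase-local-dir is only an example path.
        conf.set("hbase.local.dir", "/tmp/hbase-local-dir");
        HConnection connection = HConnectionManager.createConnection(conf);
        connection.close();
    }
}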

Contributor

Awesome... I did exactly that and it worked. Thanks for looking at it.

Super Guru

Great! Glad to hear it's working for you now. I'd encourage you to up-vote and/or accept my answer as correct so other people know for the future.