
Storm can't write to HBase in DRPC mode, java.lang.NullPointerException

Contributor

Hi,

I am trying to write Storm output to HBase. It works fine in local mode, but when I switch to cluster mode I get this error:

java.lang.NullPointerException at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:351...

I've also exported the HBase client jars on the HADOOP_CLASSPATH before submitting the topology:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hbase-client/lib/*.jar

but without success; the same error is shown. Any idea how to solve it?

Thanks in advance. Giuseppe

1 ACCEPTED SOLUTION

Contributor

Hi All,

I've solved the issue: it was caused by the TableName variable being declared static in the bolt and initialized in the constructor.

In the bolt's prepare() method it was null in DRPC mode because, I suppose, the bolt is serialized and shipped to the worker JVMs, and static fields are not part of an object's serialized state, so the value set in the constructor never reaches the worker. I changed the field from static to a private instance field and now it works fine.
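The behaviour described above can be reproduced with plain Java serialization, without Storm or HBase at all. The sketch below (class, field, and method names are hypothetical, not from the actual topology) shows that a static field set in the constructor does not survive the serialize/deserialize round trip Storm performs when shipping a bolt to a worker JVM, while a private instance field does:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class StaticFieldDemo {

    // Stand-in for the bolt; all names here are illustrative.
    static class Bolt implements Serializable {
        static String staticTableName;   // NOT serialized: static state stays in the submitting JVM
        private final String tableName;  // serialized with the object, so it reaches the worker

        Bolt(String table) {
            staticTableName = table;     // what the original bolt did
            this.tableName = table;      // the fix: keep the name as instance state
        }

        String seenByPrepareBroken() { return staticTableName; }
        String seenByPrepareFixed()  { return tableName; }
    }

    static byte[] serialize(Object o) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) { oos.writeObject(o); }
            return bos.toByteArray();
        } catch (Exception e) { throw new RuntimeException(e); }
    }

    static Object deserialize(byte[] bytes) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        } catch (Exception e) { throw new RuntimeException(e); }
    }

    public static void main(String[] args) {
        byte[] shipped = serialize(new Bolt("signal_groups")); // topology submission
        Bolt.staticTableName = null;  // simulate the fresh worker JVM: the constructor never ran there
        Bolt onWorker = (Bolt) deserialize(shipped);           // worker deserializes the bolt
        System.out.println("static field in prepare():   " + onWorker.seenByPrepareBroken()); // null
        System.out.println("instance field in prepare(): " + onWorker.seenByPrepareFixed());  // signal_groups
    }
}
```

Resetting the static field before deserializing stands in for the fresh worker JVM, where the constructor is never invoked again, so the static field is still at its default value when prepare() runs.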

Thanks anyway.


4 REPLIES

Super Guru

Any chance you can edit your question to include the complete Java stack trace, please?

Contributor

Hi Emil and Josh, thanks for your replies. I use hbase-client version 1.1.1 and I've developed a custom HBase bolt that implements the IRichBolt interface. I have included hbase-site.xml in the project so that the cluster configuration is read. In fact, when I create the configuration and then the connection, all settings are read correctly:

Configuration config = HBaseConfiguration.create();
connection = ConnectionFactory.createConnection(config);

I've also checked that the XML file is added to the jar root, so it should be read when the topology is running. As written above, it works fine in local mode but fails in DRPC mode. This is the trace I can see through the Storm UI:

2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:java.io.tmpdir=/tmp
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:java.compiler=<NA>
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:os.name=Linux
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:os.arch=amd64
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:os.version=3.10.0-229.7.2.el7.x86_64
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:user.name=storm
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:user.home=/home/storm
2016-05-02 07:20:56.685 o.a.z.ZooKeeper [INFO] Client environment:user.dir=/home/storm
2016-05-02 07:20:56.686 o.a.z.ZooKeeper [INFO] Initiating client connection, connectString=swrmaster01.northeurope.cloudapp.azure.com:2181,swrmaster02.northeurope.cloudapp.azure.com:2181,swrmaster03.northeurope.cloudapp.azure.com:2181 sessionTimeout=90000 watcher=hconnection-0x5f1bd8ec0x0, quorum=swrmaster01.northeurope.cloudapp.azure.com:2181,swrmaster02.northeurope.cloudapp.azure.com:2181,swrmaster03.northeurope.cloudapp.azure.com:2181, baseZNode=/hbase-unsecure
2016-05-02 07:20:56.698 o.a.z.ClientCnxn [INFO] Opening socket connection to server swrmaster03.northeurope.cloudapp.azure.com/10.2.0.13:2181. Will not attempt to authenticate using SASL (unknown error)
2016-05-02 07:20:56.699 o.a.z.ClientCnxn [INFO] Socket connection established to swrmaster03.northeurope.cloudapp.azure.com/10.2.0.13:2181, initiating session
2016-05-02 07:20:56.719 o.a.z.ClientCnxn [INFO] Session establishment complete on server swrmaster03.northeurope.cloudapp.azure.com/10.2.0.13:2181, sessionid = 0x3546dbd44bb0011, negotiated timeout = 40000
2016-05-02 07:20:57.192 o.a.h.h.s.DomainSocketFactory [WARN] The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-05-02 07:20:57.255 b.s.util [ERROR] Async loop died!
java.lang.NullPointerException
   at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:351) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:318) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:726) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:708) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:542) ~[stormjar.jar:?]
   at com.ecube.swarco.stormtopology.hbase.SignalGroupsHBaseBolt.prepare(SignalGroupsHBaseBolt.java:60) ~[stormjar.jar:?]
   at backtype.storm.daemon.executor$fn__3697$fn__3710.invoke(executor.clj:746) ~[storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at backtype.storm.util$async_loop$fn__544.invoke(util.clj:473) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
   at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2016-05-02 07:20:57.260 b.s.d.executor [ERROR] 
java.lang.NullPointerException
   at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:351) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:318) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:726) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:708) ~[stormjar.jar:?]
   at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getTable(ConnectionManager.java:542) ~[stormjar.jar:?]
   at com.ecube.swarco.stormtopology.hbase.SignalGroupsHBaseBolt.prepare(SignalGroupsHBaseBolt.java:60) ~[stormjar.jar:?]
   at backtype.storm.daemon.executor$fn__3697$fn__3710.invoke(executor.clj:746) ~[storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at backtype.storm.util$async_loop$fn__544.invoke(util.clj:473) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
   at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2016-05-02 07:20:57.318 b.s.d.executor [INFO] Processing received message FOR 3 TUPLE: source: signalgroupsSpout:9, stream: __ack_init, id: {}, [-2915561833589860894, 6619686113940453962, 9]
2016-05-02 07:20:57.318 b.s.d.executor [INFO] BOLT ack TASK: 3 TIME: 0 TUPLE: source: signalgroupsSpout:9, stream: __ack_init, id: {}, [-2915561833589860894, 6619686113940453962, 9]
2016-05-02 07:20:57.322 b.s.d.executor [INFO] Execute done TUPLE source: signalgroupsSpout:9, stream: __ack_init, id: {}, [-2915561833589860894, 6619686113940453962, 9] TASK: 3 DELTA:
2016-05-02 07:20:57.326 b.s.d.executor [INFO] Processing received message FOR 3 TUPLE: source: signalGroupsFilterBolt:5, stream: __ack_ack, id: {}, [-2915561833589860894, 6619686113940453962]
2016-05-02 07:20:57.327 b.s.d.task [INFO] Emitting direct: 9; __acker __ack_ack [-2915561833589860894]
2016-05-02 07:20:57.327 b.s.d.executor [INFO] TRANSFERING tuple TASK: 9 TUPLE: source: __acker:3, stream: __ack_ack, id: {}, [-2915561833589860894]
2016-05-02 07:20:57.327 b.s.d.executor [INFO] BOLT ack TASK: 3 TIME:  TUPLE: source: signalGroupsFilterBolt:5, stream: __ack_ack, id: {}, [-2915561833589860894, 6619686113940453962]
2016-05-02 07:20:57.327 b.s.d.executor [INFO] Execute done TUPLE source: signalGroupsFilterBolt:5, stream: __ack_ack, id: {}, [-2915561833589860894, 6619686113940453962] TASK: 3 DELTA:
2016-05-02 07:20:57.366 b.s.util [ERROR] Halting process: ("Worker died")
java.lang.RuntimeException: ("Worker died")
   at backtype.storm.util$exit_process_BANG_.doInvoke(util.clj:332) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at clojure.lang.RestFn.invoke(RestFn.java:423) [clojure-1.6.0.jar:?]
   at backtype.storm.daemon.worker$fn__5927$fn__5928.invoke(worker.clj:636) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at backtype.storm.daemon.executor$mk_executor_data$fn__3530$fn__3531.invoke(executor.clj:256) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at backtype.storm.util$async_loop$fn__544.invoke(util.clj:485) [storm-core-0.10.0.2.3.4.0-3485.jar:0.10.0.2.3.4.0-3485]
   at clojure.lang.AFn.run(AFn.java:22) [clojure-1.6.0.jar:?]
   at java.lang.Thread.run(Thread.java:745) [?:1.8.0_60]
2016-05-02 07:20:57.368 b.s.d.worker [INFO] Shutting down worker swarco-storm-topology_v01_test-5-1462173409 e3fe85d7-fecd-434e-b27e-e0e93dc76c6a 6701

Do you have any idea how to solve it?

Thanks again,

Giuseppe

Contributor

No Emil, I don't set hbase.rootdir and zookeeper.znode.parent in the Storm config because, I think, they are read from the XML. I'll try setting them.

I created a custom HBase bolt because I need to apply some operations before storing in HBase, and that cannot be done with the SimpleHBaseMapper and HBaseBolt shipped with storm-hbase.
