Member since
12-18-2016
3
Posts
2
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1044 | 12-18-2016 07:45 PM
12-18-2016
07:45 PM
2 Kudos
FIXED. I updated the configuration as follows:

Map<String, Object> HBConf = new HashMap<String, Object>();
/* Find server1, server2, server3 in your configuration: Ambari > HBase > Advanced > hbase.zookeeper.quorum */
HBConf.put("hbase.zookeeper.quorum", "server1,server2,server3");
HBConf.put("hbase.zookeeper.property.clientPort", "2181");
HBConf.put("hbase.master.port", "16000");
/* Find zookeeper.znode.parent in your configuration: Ambari > HBase > Advanced > zookeeper.znode.parent */
HBConf.put("zookeeper.znode.parent", "/hbase-unsecure");
config.put("HBCONFIG", HBConf);
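To keep these settings in one place, the same map can be built by a small self-contained helper; this is just a sketch, and the hostnames are placeholders to be replaced with the values from Ambari:

```java
import java.util.HashMap;
import java.util.Map;

public class HBaseConfSketch {
    // Builds the config map that is later registered under the "HBCONFIG" key
    // and picked up by HBaseBolt.withConfigKey("HBCONFIG").
    // server1/server2/server3 are placeholders: copy the real quorum from
    // Ambari > HBase > Advanced > hbase.zookeeper.quorum, and the znode parent
    // from Ambari > HBase > Advanced > zookeeper.znode.parent.
    public static Map<String, Object> buildHBaseConf() {
        Map<String, Object> conf = new HashMap<>();
        conf.put("hbase.zookeeper.quorum", "server1,server2,server3");
        conf.put("hbase.zookeeper.property.clientPort", "2181");
        conf.put("hbase.master.port", "16000");
        conf.put("zookeeper.znode.parent", "/hbase-unsecure");
        return conf;
    }
}
```

Note that the znode parent is the setting that differs between secure ("/hbase") and unsecure ("/hbase-unsecure") HDP clusters, which is why copying it from Ambari matters.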
12-18-2016
06:19 PM
Hello, I created the following topology:

Config config = new Config();
config.setClasspath("/usr/hdp/2.5.0.0-1245/hbase/lib");
config.setDebug(true);
config.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
Map<String, Object> HBConf = new HashMap<String,Object>();
HBConf.put("hbase.rootdir","hdfs://localhost:8020/apps/hbase/data");
HBConf.put("zookeeper.znode.parent", "/hbase");
config.put("HBCONFIG",HBConf);
//TEST HBASE
SimpleHBaseMapper mapper = new SimpleHBaseMapper()
.withRowKeyField("row")
.withColumnFields(new Fields("driverName"))
.withColumnFamily("events");
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("word-spout", new WordGenerator());
builder.setBolt("pre-hive", new PrepareTuple()).shuffleGrouping("word-spout");
builder.setBolt("hbase-bolt", new HBaseBolt("driver_dangerous_event", mapper).withConfigKey("HBCONFIG")).shuffleGrouping("pre-hive");
LocalCluster cluster = new LocalCluster();
cluster.submitTopology("HelloStorm", config, builder.createTopology());
This topology is built with three components: WordGenerator, which just generates random words; PrepareTuple, which prepares the data to be inserted into HBase and emits the three fields used by the SimpleHBaseMapper; and the HBaseBolt itself. The relevant PrepareTuple code:

public void execute(Tuple input) {
    this.collector.emit(new Values(id, input.getString(0), "events"));
    id = id + 1;
    System.out.println("###PRE-HBASE### Emitting tuple:" + id + "," + input.getString(0));
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("row","driverName","events"));
}
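For HBaseBolt to find the row key and column values, the order of the fields declared in declareOutputFields must match the order of the values passed to emit. The pairing is purely positional, which this small self-contained sketch (plain Java, not the Storm API) illustrates:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TupleFieldPairing {
    // Pairs declared output field names with emitted values by position,
    // mirroring how Storm matches declareOutputFields(new Fields(...))
    // against collector.emit(new Values(...)).
    public static Map<String, Object> pair(List<String> fields, List<Object> values) {
        if (fields.size() != values.size()) {
            throw new IllegalArgumentException("field/value count mismatch");
        }
        Map<String, Object> tuple = new LinkedHashMap<>();
        for (int i = 0; i < fields.size(); i++) {
            tuple.put(fields.get(i), values.get(i));
        }
        return tuple;
    }
}
```

With the fields ("row", "driverName", "events") and the values (2, "Word 1", "events"), the mapper configured with withRowKeyField("row") would use 2 as the row key and "Word 1" as the driverName column value.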
The idea is to create a tuple to be written to the "driver_dangerous_event" HBase table. Unfortunately, no rows are inserted into the table. This is my topology's log, and I can't see any errors:

###PRE-HBASE### Emitting tuple:2,Word 1
7767 [Thread-16-pre-hive-executor[3 3]] INFO o.a.s.d.executor - Execute done TUPLE source: word-spout:4, stream: default, id: {}, [Word 1] TASK: 3 DELTA:
7776 [Thread-18-__acker-executor[1 1]] INFO o.a.s.d.executor - Preparing bolt __acker:(1)
7787 [Thread-20-__system-executor[-1 -1]] INFO o.a.s.d.executor - Preparing bolt __system:(-1)
7797 [Thread-18-__acker-executor[1 1]] INFO o.a.s.d.executor - Prepared bolt __acker:(1)
7800 [Thread-20-__system-executor[-1 -1]] INFO o.a.s.d.executor - Prepared bolt __system:(-1)
8341 [Thread-14-hbase-bolt-executor[2 2]] WARN o.a.h.u.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
8752 [Thread-14-hbase-bolt-executor[2 2]] INFO o.a.h.h.z.RecoverableZooKeeper - Process identifier=hconnection-0x2ea63a1 connecting to ZooKeeper ensemble=localhost:2181
9193 [Thread-14-hbase-bolt-executor[2 2]] INFO o.a.s.d.executor - Prepared bolt hbase-bolt:(2)
9193 [Thread-14-hbase-bolt-executor[2 2]] INFO o.a.s.d.executor - Processing received message FOR 2 TUPLE: source: pre-hive:3, stream: default, id: {}, [2, Word 1, events]
9198 [Thread-14-hbase-bolt-executor[2 2]] INFO o.a.s.d.executor - Execute done TUPLE source: pre-hive:3, stream: default, id: {}, [2, Word 1, events] TASK: 2 DELTA:
9198 [Thread-14-hbase-bolt-executor[2 2]] INFO o.a.s.d.executor - Processing received message FOR -2 TUPLE: source: __system:-1, stream: __tick, id: {}, [1]
12709 [Thread-22-word-spout-executor[4 4]] INFO o.a.s.d.task - Emitting: word-spout default [Word 1]
12710 [Thread-22-word-spout-executor[4 4]] INFO o.a.s.d.executor - TRANSFERING tuple [dest: 3 tuple: source: word-spout:4, stream: default, id: {}, [Word 1]]
Is there something wrong here? Thanks
Labels:
- Apache HBase
- Apache Storm
12-18-2016
03:10 PM
Hello, I am trying to create an HBase bolt. Here is the code:

Config config = new Config();
config.setDebug(true);
config.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);
Map<String, Object> HBConf = new HashMap<String,Object>();
HBConf.put("hbase.rootdir","hdfs://localhost:8020/apps/hbase/data");
HBConf.put("hbase.zookeeper.property.clientPort","2181");
HBConf.put("hbase.master", "localhost:60000");
config.put("HBCONFIG",HBConf);
//TEST HBASE
SimpleHBaseMapper mapper = new SimpleHBaseMapper()
.withRowKeyField("nome")
.withColumnFields(new Fields("cognome"))
.withColumnFamily("cf");
HBaseBolt hbase = new HBaseBolt("WordCount", mapper).withConfigKey("HBCONFIG");
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("word-spout", new WordGenerator());
builder.setBolt("pre-hive", new PrepareTuple()).shuffleGrouping("word-spout");
builder.setBolt("hbase-bolt", hbase).shuffleGrouping("pre-hive");
LocalCluster cluster = new LocalCluster();
cluster.submitTopology("HelloStorm", config, builder.createTopology());
The HBase bolt should write to the existing table WordCount; I can see the table in the output of the hbase shell `list` command. When I run my topology I get the following error:

8019 [Thread-14-hbase-bolt-executor[2 2]] ERROR o.a.h.h.c.AsyncProcess - Cannot get replica 0 location for {"totalColumns":1,"row":"Storm apache","families":{"cf":[{"qualifier":"cognome","vlen":7,"tag":[],"timestamp":9223372036854775807}]}}
org.apache.hadoop.hbase.TableNotFoundException: WordCount
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1264) ~[hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1162) ~[hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl.findAllLocationsOrFail(AsyncProcess.java:958) [hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl.groupAndSendMultiAction(AsyncProcess.java:866) [hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.AsyncProcess$AsyncRequestFutureImpl.access$100(AsyncProcess.java:584) [hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.AsyncProcess.submitAll(AsyncProcess.java:566) [hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.hadoop.hbase.client.HTable.batch(HTable.java:906) [hbase-client-1.1.2.2.5.0.0-1245.jar:1.1.2.2.5.0.0-1245]
at org.apache.storm.hbase.common.HBaseClient.batchMutate(HBaseClient.java:101) [storm-hbase-1.0.2.jar:1.0.2]
at org.apache.storm.hbase.bolt.HBaseBolt.execute(HBaseBolt.java:96) [storm-hbase-1.0.2.jar:1.0.2]
at org.apache.storm.daemon.executor$fn__6571$tuple_action_fn__6573.invoke(executor.clj:734) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.daemon.executor$mk_task_receiver$fn__6492.invoke(executor.clj:469) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.disruptor$clojure_handler$reify__6005.onEvent(disruptor.clj:40) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.utils.DisruptorQueue.consumeBatchToCursor(DisruptorQueue.java:451) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.utils.DisruptorQueue.consumeBatchWhenAvailable(DisruptorQueue.java:430) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.disruptor$consume_batch_when_available.invoke(disruptor.clj:73) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.daemon.executor$fn__6571$fn__6584$fn__6637.invoke(executor.clj:853) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at org.apache.storm.util$async_loop$fn__554.invoke(util.clj:484) [storm-core-1.0.1.2.5.0.0-1245.jar:1.0.1.2.5.0.0-1245]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_111]
Where could the error be? Thanks
Labels:
- Apache HBase
- Apache Storm