Does NiFi (HDF 1.2) have a processor to read records from an HBase table based on a row key?
Created 04-18-2016 06:50 PM
Created 04-18-2016 06:54 PM
There is a GetHBase processor described here:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.hbase.GetHBase/index.html
It is intended to do an incremental retrieval from an HBase table.
Does this work for your use-case, or are you looking for something different?
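For reference, "incremental retrieval" means GetHBase only emits cells newer than what it has already seen on previous runs. A minimal sketch of that pattern with the plain HBase Java client (an illustration of the idea, not GetHBase's actual code; the table name is a placeholder):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class IncrementalScanSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        long lastSeenTimestamp = 0L; // state that would be persisted between runs

        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("my_table"))) { // placeholder table
            // Only scan cells written after the last timestamp already processed
            Scan scan = new Scan();
            scan.setTimeRange(lastSeenTimestamp + 1, Long.MAX_VALUE);
            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result result : scanner) {
                    // each new/updated row would become a flow file
                }
            }
        }
    }
}
```

Note this is still a scan over the whole table, driven by the processor's own schedule, which is why it does not fit a fetch-by-key use case on its own.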
Created 04-18-2016 06:54 PM
@Anoop Nair Take a look at the GetHBase processor here. Use a filter on the row key.
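If it helps, GetHBase exposes a Filter Expression property; assuming it accepts HBase's string filter language, limiting the scan to a single row might look like this (the row key value is a placeholder):

```
Filter Expression: RowFilter (=, 'binary:my-row-key')
```

Keep in mind that GetHBase remains a source processor, so the row key has to be configured up front rather than taken from an incoming flow file.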
Created 04-18-2016 06:56 PM
I'm not looking for a polling processor. The requirement is to query HBase for a specific row key on an on-demand basis.
Created 04-18-2016 07:23 PM
What do you mean by "on-demand basis"?
Created 04-18-2016 09:46 PM
The requirement is to read data from a Kafka topic. The data contains the row key that should be used to query an HBase table, and the column value should then be sent to a JMS queue. Here is the flow:
GetKafka processor -> GetHBase -> PublishJMS
The problem is that GetHBase cannot be used as an intermediate processor in the above chain.
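One possible workaround until a fetch-style processor exists is a scripted or custom processor that reads the row key from the incoming flow file and does a point lookup with the HBase Java client. A minimal sketch, with the table, column family, and qualifier as placeholders:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseRowLookup {

    // rowKey would come from the Kafka message / flow file in the real flow
    public static byte[] fetchCell(Connection conn, String rowKey) throws IOException {
        try (Table table = conn.getTable(TableName.valueOf("my_table"))) {   // placeholder table
            Get get = new Get(Bytes.toBytes(rowKey));
            get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("my_column")); // placeholder column
            Result result = table.get(get);
            return result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("my_column"));
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf)) {
            byte[] value = fetchCell(conn, "some-row-key"); // placeholder row key
            System.out.println(value == null ? "cell not found" : Bytes.toString(value));
        }
    }
}
```

The value returned here is what would then be handed on to the JMS publishing step.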
Created 12-21-2016 04:34 AM
@Anoop Nair Were you able to create a GetHBase-by-key processor? I would appreciate it if you could share your experience. We have a similar use case.
Created 12-21-2016 01:55 PM
@Theena R can you provide some input about your use-case?
- Are you receiving messages from somewhere (Kafka/JMS/etc.) and those messages have a row key that you want to fetch?
- Are you looking to fetch the whole row (i.e. multiple cells), or do you know a specific col-family:col-qualifier and only want to fetch a single cell?
- What type of output would you expect? Something similar to the JSON representation that GetHBase produces?
Created 12-21-2016 02:24 PM
@Bryan Bende Yes. On day 1, we receive a JSON record as input which has a key, and we load it into HBase using PutHBaseJSON. On day 2, we get the same message with a few changes, and we need to determine whether anything changed and emit the delta to another processor. I am looking to query HBase based on the row key, get all of the other columns back, compare them in the next processor (JSON path evaluation), and route the record as a delta if any of the values changed. GetHBase only scans a table and generates flow files. I need to query on demand based on a row key and get a JSON representation as output.
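Just to illustrate the fetch-and-compare step described above (a sketch only, with placeholder names, not an existing processor): fetch the stored row by key, turn it into a qualifier-to-value map, and diff it against the map parsed from the incoming JSON record.

```java
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class RowDeltaCheck {

    /** Load the currently stored version of the row as qualifier -> value strings. */
    static Map<String, String> storedRow(Connection conn, String rowKey) throws IOException {
        Map<String, String> stored = new HashMap<>();
        try (Table table = conn.getTable(TableName.valueOf("my_table"))) { // placeholder table
            Result result = table.get(new Get(Bytes.toBytes(rowKey)));
            for (Cell cell : result.rawCells()) {
                stored.put(Bytes.toString(CellUtil.cloneQualifier(cell)),
                           Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
        return stored;
    }

    /** True if the incoming record differs from what is already in HBase. */
    static boolean isDelta(Map<String, String> stored, Map<String, String> incoming) {
        return !stored.equals(incoming);
    }
}
```

Only records for which isDelta returns true would be routed on as a delta.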
Created 04-19-2016 06:30 PM
I created this JIRA:
https://issues.apache.org/jira/browse/NIFI-1784
How would you want the data for the row to be represented in a NiFi flow file?
A JSON document where the key/value pairs are column qualifiers and values?
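Purely as an illustration of that question (not a committed format, and all names are placeholders), one possible flow file content for a fetched row could be:

```json
{
  "row": "some-row-key",
  "cells": {
    "cf1:qualifier1": "value1",
    "cf1:qualifier2": "value2"
  }
}
```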