ETL With Lookups with Apache HBase and Apache NiFi

(Microservices Style ETL)

When we ingest tabular / record-oriented data, we often want to enrich it by replacing IDs with descriptions, or vice versa. Many transformations may need to happen before the data is in a usable state. When you denormalize your data in Hadoop you usually build very wide tables, and you often want descriptions or other reference data included to improve their usability. Getting everything you need in a single call is nice, especially when you have 100 trillion records.

We are building on a lot of work that has already been done (https://community.hortonworks.com/articles/146198/data-flow-enrichment-with-nifi-part-3-lookuprecord...). Make sure you read Abdelkrim's first three lookup articles; I added some fields to his generated data for testing.

I want to do my lookups against HBase, which is a great NoSQL store for lookup tables and generated datasets.

First, I created an HBase table to use for lookups.


Create HBase Table For Lookups

create 'lookup_', 'family'
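If you want to populate or inspect this table outside of NiFi, one quick option is the HBase Thrift API via Python's happybase library. This is a minimal sketch, assuming a Thrift server on the default localhost:9090; the prod_desc column qualifier is my own naming for illustration:

# Minimal sketch: write and read a lookup row over HBase Thrift.
# Assumes an HBase Thrift server on localhost:9090 (the default);
# the 'prod_desc' qualifier is an assumption for illustration.
import happybase

connection = happybase.Connection('localhost')
table = connection.table('lookup_')

# Row key = product id, value = description text.
table.put(b'430672', {b'family:prod_desc': b'Pork chop leberkas brisket chuck.'})

# Read it back to verify the lookup row.
print(table.row(b'430672')[b'family:prod_desc'].decode('utf-8'))

connection.close()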


Table With Data
72499-lookuphbasetable.png

Most people would have a pre-populated lookup table. I don't, and since we are using a generator to build the lookup IDs, I am building the lookup descriptions with a REST call at the same time. We could also have a flow that adds a lookup entry whenever one isn't found, or another flow that ingests the lookup values and adds or updates them as needed.

REST API To Generate Product Descriptions

https://baconipsum.com/api/?type=meat&sentences=1&format=text

I found this cool API that returns a sentence of meat words. I use this as our description, because MEAT!
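Outside of NiFi's InvokeHTTP, you can exercise the same endpoint with a couple of lines of Python; the requests library here is just one way to sketch it:

# Fetch one sentence of meat-flavored text to use as a product description.
import requests

resp = requests.get('https://baconipsum.com/api/',
                    params={'type': 'meat', 'sentences': 1, 'format': 'text'},
                    timeout=10)
resp.raise_for_status()
print(resp.text)  # e.g. "Pork chop leberkas brisket chuck ..."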

Call the Bacon API!!!

72507-lookupinvokehttp.png

Let's turn our plain text into a clean JSON document
72508-lookupreplacetext.png
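The flow does this with ReplaceText; the equivalent step in Python is just wrapping the raw sentence in a JSON object. The field name below is my assumption; json.dumps takes care of escaping any stray quotes in the generated text:

# Sketch of the ReplaceText step: wrap the raw sentence in a JSON document.
import json

sentence = 'Pork chop leberkas brisket chuck, filet mignon turducken hamburger.'
print(json.dumps({'prod_desc': sentence.strip()}))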

Then I store it in HBase as my lookup table. You probably already have one; this is a demo, so I am filling it with my generator. This is not a best practice or a good design pattern, just a lazy way to populate a table.

Example Apache NiFi Flow (Using Apache NiFi 1.5)

72498-lookupflow.png

Generate Some Test Data (https://community.hortonworks.com/articles/146198/data-flow-enrichment-with-nifi-part-3-lookuprecord...)

72500-generateflowlookup.png


Generate A JSON Document (Note the Empty prod_desc)

{
  "ts" : "${now():format('yyyyMMddHHmmss')}",
  "updated_dt" : "${now()}",
  "id_store" : ${random():mod(5):toNumber():plus(1)},
  "event_type" : "generated",
  "uuid" : "${UUID()}",
  "hostname" : "${hostname()}",
  "ip" : "${ip()}",
  "counter" : "${nextInt()}",
  "id_transaction" : "${random():toString()}",
  "id_product" : ${random():mod(500000):toNumber()},
  "value_product" : ${now():toNumber()},
  "prod_desc": ""
}
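If you want the same shape of test record without NiFi, here is a rough Python equivalent of the template above. It only approximates the Expression Language calls (nextInt(), ip(), and so on), so treat it as a sketch:

# Rough Python equivalent of the GenerateFlowFile template above.
# Approximates the NiFi Expression Language calls; not byte-for-byte identical.
import json, random, socket, time, uuid
from datetime import datetime

now = datetime.now()
record = {
    "ts": now.strftime("%Y%m%d%H%M%S"),
    "updated_dt": now.ctime(),
    "id_store": random.randrange(5) + 1,
    "event_type": "generated",
    "uuid": str(uuid.uuid4()),
    "hostname": socket.gethostname(),
    "ip": socket.gethostbyname(socket.gethostname()),  # may vary by environment
    "counter": "1",  # stand-in for NiFi's nextInt()
    "id_transaction": str(random.getrandbits(63)),
    "id_product": random.randrange(500000),
    "value_product": int(time.time() * 1000),
    "prod_desc": "",
}
print(json.dumps(record, indent=2))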


Lookup Your Record

72501-lookuprecord.png

This is the magic. We take in our records; in this case we are reading and writing JSON records, but we could choose CSV, Avro, or other formats. We connect to the HBase Record Lookup Service, use the id_product field as the lookup key, and replace the current prod_desc field in the record with whatever the lookup returns. Nothing else is needed to change records in-stream.
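Under the covers the enrichment amounts to "fetch a value by key, then overwrite a field." A minimal Python sketch of that per-record logic, again going through happybase and again assuming the prod_desc qualifier:

# Sketch of what LookupRecord does per record: use id_product as the
# row key and overwrite prod_desc with the value found in HBase.
import happybase

def enrich(records, table):
    for rec in records:
        row = table.row(str(rec['id_product']).encode('utf-8'))
        desc = row.get(b'family:prod_desc')  # qualifier is an assumption
        if desc is not None:
            rec['prod_desc'] = desc.decode('utf-8')
        yield rec

connection = happybase.Connection('localhost')
for rec in enrich([{'id_product': 430672, 'prod_desc': ''}],
                  connection.table('lookup_')):
    print(rec)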

HBase Record Lookup Service

72509-lookupputhbasejson.png

HBase Client Service Used by HBase Record Lookup Service

72502-hbaserecordlookupservice.png

We can use UpdateRecord to clean up, transform, or modify any field in the records in-stream.

72505-updaterecord2.png

72506-updaterecord.png
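The record-path expressions in UpdateRecord boil down to simple per-field rewrites. As a hypothetical example in the same spirit (the actual transforms configured in the screenshots above may differ):

# Hypothetical per-field cleanups in the spirit of UpdateRecord;
# the actual transforms configured in the flow may differ.
def update_record(rec):
    rec['prod_desc'] = rec.get('prod_desc', '').strip()    # trim whitespace
    rec['event_type'] = rec.get('event_type', '').lower()  # normalize case
    return rec

print(update_record({'prod_desc': '  Pork chop leberkas  ',
                     'event_type': 'GENERATED'}))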

Original File

{
  "ts" : "20180427185615",
  "updated_dt" : "Fri Apr 27 18:56:15 UTC 2018",
  "id_store" : 1,
  "event_type" : "generated",
  "uuid" : "0d16967d-102d-4864-b55a-3f1cb224a0a6",
  "hostname" : "princeton1",
  "ip" : "172.26.217.170",
  "counter" : "7463",
  "id_transaction" : "5307056748245491959",
  "id_product" : 430672,
  "value_product" : 1524855375500,
  "prod_desc": ""
}

Final File (Note we have populated prod_desc with MEAT!)

[ {
  "ts" : "201856271804499",
  "prod_desc" : "Pork chop leberkas brisket chuck, filet mignon turducken hamburger.",
  "updated_dt" : "Fri Apr 27 18:56:15 UTC 2018",
  "id_store" : 1,
  "event_type" : "generated",
  "uuid" : "0d16967d-102d-4864-b55a-3f1cb224a0a6",
  "hostname" : "princeton1",
  "ip" : "172.26.217.170",
  "counter" : "7463",
  "id_transaction" : "5307056748245491959",
  "id_product" : 430672,
  "value_product" : 1524855375500
} ]

References:

For those who would rather not include meat in their data, there are alternatives: https://www.vegguide.org/site/api-docs

Example Flow

etlv2.xml

