TimothySpann_0-1608743864530.png
Sometimes you need real CDC: you have access to the transaction change logs and use a tool like Qlik Replicate or GoldenGate to pump change records into Kafka, where Flink SQL or NiFi can read and process them.

Other times you need something easier, just basic inserts and updates on a few tables whose new data you want to receive as events. Apache NiFi can do this easily for you with QueryDatabaseTableRecord. You don't need to know anything but the database connection information, the table name, and which column changes. NiFi will query the table, keep track of state, and hand you only the new records. Nothing is hardcoded; parameterize those values and you have a generic 'Any RDBMS' to 'Any Other Store' data pipeline. We are reading as records, which means each FlowFile in NiFi can carry thousands of records for which we know all the fields, types, and schema-related information. The schema can be one NiFi infers or one we manage in a schema registry like Cloudera's amazing open source Schema Registry.

Let's see what data is in our PostgreSQL table:

TimothySpann_1-1608743864459.png
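For reference, here is a minimal sketch of what that source table might look like. The column names come from the ORC DDL shown later in the article; the PostgreSQL types and defaults are assumptions.

-- Hypothetical PostgreSQL DDL for the source table (types and defaults are assumptions)
CREATE TABLE prices (
    item_id    BIGSERIAL PRIMARY KEY,            -- sequential key NiFi can increment on
    price      DOUBLE PRECISION NOT NULL,
    created_on TIMESTAMP NOT NULL DEFAULT now(),
    updated_on TIMESTAMP NOT NULL DEFAULT now()  -- bumped on every update
);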

How To 

  • QueryDatabaseTableRecord (we will output JSON records, but could have done Parquet, XML, CSV or AVRO)
  • UpdateAttribute - optional - set a table and schema name; this can also be done with parameters.
  • MergeRecord - optional - let's batch these up.
  • PutORC - let's send these records to HDFS (which could be on bare metal disks, GCS, S3, Azure or ADLS). This will also give us the DDL to build an external Hive table.
TimothySpann_2-1608743864461.png

PutORC

TimothySpann_3-1608743864519.png

 

TimothySpann_4-1608743864488.png

As you can see, we are querying the "prices" table and tracking maximum values on the updated_on date and the item_id sequential key, so only new or changed rows are returned. We then output JSON records.
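Conceptually, each run of QueryDatabaseTableRecord issues an incremental query like the sketch below. This is not the literal SQL NiFi generates (it builds the statement itself and handles multiple maximum-value columns), but it shows the idea for a single maximum-value column; the timestamp is a placeholder for whatever value NiFi last stored in state.

-- Rough shape of the incremental fetch; not NiFi's exact generated SQL
SELECT item_id, price, created_on, updated_on
FROM prices
WHERE updated_on > '2020-12-23 10:00:00'   -- last maximum value recorded in processor state
ORDER BY updated_on;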

 

TimothySpann_5-1608743864515.png
We could then:

Add-On Examples

  • PutKudu
  • PutHDFS (send as JSON, CSV, Parquet) and build an Impala or Hive table on top as external
  • PutHive3Streaming (Hive 3 ACID Tables)
  • PutS3
  • PutAzureDataLakeStorage
  • PutHBaseRecord
  • PublishKafkaRecord_2_* - send a copy to Kafka for Flink SQL, Spark Streaming, Spring, etc...
  • PutBigQueryStreaming (Google)
  • PutCassandraRecord
  • PutDatabaseRecord - let's send to another JDBC Datastore
  • PutDruidRecord - Druid is a cool datastore, check it out on CDP Public Cloud
  • PutElasticSearchRecord
  • PutMongoRecord
  • PutSolrRecord
  • PutRecord (to many RecordSinkServices like Databases, Kafka, Prometheus, Scripted and Site-to-Site)
  • PutParquet (store to HDFS as Parquet files)
You can use any number of these, all of them, or multiple copies of each pointed at other clouds or clusters. You can also do enrichment, transformation, alerting, querying, or routing.

These records can be also manipulated ETL/ELT style with Record processing in stream with options such as:

  • QueryRecord (use Calcite ANSI SQL to query and transform records and can also change output type; see the sketch after this list)
  • JoltTransformRecord (use JOLT against any record not just JSON)
  • LookupRecord (to match against Lookup services like caches, Kudu, REST services, ML models, HBase and more)
  • PartitionRecord (to break up into like groups)
  • SplitRecord (to break up record groups into records)
  • UpdateRecord (update values in fields, often paired with LookupRecord)
  • ValidateRecord (check against a schema and check for extra fields)
  • GeoEnrichIPRecord
  • ConvertRecord (change between types like JSON to CSV)
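As an example of QueryRecord, the processor exposes the incoming records as a table named FLOWFILE and lets you write Calcite SQL against it. The query below is a hypothetical one for our prices data: it keeps only higher-priced items and trims the fields.

-- Hypothetical QueryRecord query; FLOWFILE is the table name QueryRecord exposes
SELECT item_id, price, updated_on
FROM FLOWFILE
WHERE price > 100.0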

When you use PutORC, it will give you the details for building your external table. You can use PutHiveQL to auto-build this table, but most companies want this done by a DBA.

 

CREATE EXTERNAL TABLE IF NOT EXISTS `pricesorc`
  (`item_id` BIGINT, `price` DOUBLE, `created_on` BIGINT, `updated_on` BIGINT)
STORED AS ORC
LOCATION '/user/tspann/prices';
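Once a DBA runs that DDL, the landed ORC files are queryable right away from Hive (or from Impala with an equivalent table); for example:

-- Quick check that the external table sees the files NiFi is writing
SELECT item_id, price, updated_on
FROM pricesorc
ORDER BY updated_on DESC
LIMIT 10;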

 

TimothySpann_6-1608743864510.png

 

TimothySpann_7-1608743864484.png

Part 2

REST to Database

Let's reverse this now. Sometimes you want to take data, say from a REST service, and store it in a JDBC datastore.

  • InvokeHTTP (read from a REST endpoint)
  • PutDatabaseRecord (write the JSON records to our JDBC store)
That's all it takes to store data in a database. We could add some of the ETL/ELT enrichments mentioned above, or others that manipulate content.
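Under the covers, PutDatabaseRecord converts each record into a parameterized statement against the target table. A rough sketch of what that looks like with an INSERT statement type is below; the table and column names are hypothetical and would match whatever the REST payload contains.

-- Roughly what PutDatabaseRecord executes per record (hypothetical table and columns)
INSERT INTO rest_events (event_id, payload_value, created_on)
VALUES (?, ?, ?);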
 
TimothySpann_8-1608743864523.png

REST Output

TimothySpann_9-1608743864539.png

Database Connection Pool

TimothySpann_10-1608743864497.png

Get the REST Data

TimothySpann_11-1608743864487.png

PutDatabaseRecord

TimothySpann_12-1608743864460.png

From ApacheCon 2020, John Kuchmek gives a great talk on Incrementally Streaming RDBMS Data.

Resources
