PutDatabaseRecord and mix of null and not null timestamps not working

Explorer

I'm attempting to write to Impala using PutDatabaseRecord. Everything works fine until I hit a null timestamp, at which point the value ends up cast to a CHAR(19) instead of a TIMESTAMP.

The input data comes from an Oracle DB via QueryDatabaseTable, which generates the following embedded Avro schema:

{"type":"record","name":"BLAH","namespace":"TEST","fields":[{"name":"ID","type":["null","string"]},{"name":"MY_VALUE","type":["null",{"type":"bytes","logicalType":"decimal","precision":15,"scale":4}]},{"name":"TIME_INSERTED","type":["null",{"type":"long","logicalType":"timestamp-millis"}]},{"name":"TIME_UPDATED","type":["null",{"type":"long","logicalType":"timestamp-millis"}]}]}

The actual data is then:

[ {
  "ID" : "71416037",
  "MY_VALUE" : 58.75,
  "TIME_INSERTED" : "2018-05-11T22:17:44.000Z",
  "TIME_UPDATED" : null
}, {
  "ID" : "71416525",
  "MY_VALUE" : 267.5,
  "TIME_INSERTED" : "2018-05-11T22:18:25.000Z",
  "TIME_UPDATED" : "2018-05-15T22:09:37.385Z"
} ]

Taken in isolation, either record works fine: the first entry (with the null) inserts OK, and so does the second (with no nulls), but sending the two together fails. Looking at the generated SQL, it appears the timestamp is being cast to a CHAR(19) rather than a TIMESTAMP. What's a little confusing is that the error talks about the expression 'null' being of type CHAR(19), yet there is no cast around the null value.

Target table 'test' is incompatible with source expressions.
Expression 'null' (type: CHAR(19)) is not compatible with column 'time_updated' (type: TIMESTAMP)
), Query: INSERT INTO `test`(`id`, `my_value`, `time_inserted`, `time_updated`) VALUES (CAST('71416037' AS CHAR(8)), 58.75, '2018-05-11 18:17:44.0', null), (CAST('71416525' AS CHAR(8)), 267.5, '2018-05-11 18:18:25.0', CAST('2018-05-15 18:09:37' AS CHAR(19))).: 
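
For comparison, I'd expect Impala to accept a hand-written version of the same statement where the timestamp expressions are cast to TIMESTAMP instead of CHAR(19) (I haven't run this exact statement; it's just to illustrate where it seems to go wrong):

INSERT INTO `test`(`id`, `my_value`, `time_inserted`, `time_updated`)
VALUES (CAST('71416037' AS CHAR(8)), 58.75, '2018-05-11 18:17:44.0', CAST(NULL AS TIMESTAMP)),
       (CAST('71416525' AS CHAR(8)), 267.5, '2018-05-11 18:18:25.0', CAST('2018-05-15 18:09:37' AS TIMESTAMP));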

Any ideas? It seems strange that either record on its own is fine but the two together are not.

2 REPLIES

Master Guru

What version of NiFi are you using, and what do the CREATE TABLE statements for the source and target tables look like? Is Oracle your target DB, or is it a different DB? I tried to reproduce this against Oracle 11 on NiFi's master branch (so 1.6.0+, or "almost 1.7.0"): I populated a table with your actual data (using the same PutDatabaseRecord, but with a JsonTreeReader), then ran ExecuteSQL against it, which generated the same Avro schema and data you have above. I then switched PutDatabaseRecord to an AvroReader with Use Embedded Schema, and everything ran fine, inserting both rows successfully.

I'm guessing you have an older version of NiFi that might be missing some fixes and/or improvements around logical-type handling (timestamps, for example).

Explorer

I'm using NiFi 1.5. I don't have the CREATE statement for the source (it's an Oracle view), but the field in question is a TIMESTAMP(6) in the source DB. The target DB is Impala; the CREATE TABLE for that is:

create table test
( id STRING, my_value DOUBLE, time_inserted TIMESTAMP, time_updated TIMESTAMP)
STORED AS PARQUET
LOCATION '/blah/test';

As the Avro input fields are uppercase, the PutDatabaseRecord processor uses Translate Field Names, and I've also set it to quote both column and table identifiers. Unmatched fields and columns are set to ignore.
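
In other words, the relevant PutDatabaseRecord properties look roughly like this (paraphrased from my processor config):

Translate Field Names     = true
Quote Column Identifiers  = true
Quote Table Identifiers   = true
Unmatched Field Behavior  = Ignore Unmatched Fields
Unmatched Column Behavior = Ignore Unmatched Columns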

I'll have a try later with 1.6 and see if that solves it; if not, I'll have a go at a 1.7 build. In the meantime I've split the flow into two separate QueryDatabaseTable processors, one for null and one for non-null TIME_UPDATED (luckily it's the only nullable timestamp!).
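
The split effectively amounts to these two source queries (using the BLAH name from the schema above as a stand-in for the real view):

SELECT ID, MY_VALUE, TIME_INSERTED, TIME_UPDATED
FROM TEST.BLAH
WHERE TIME_UPDATED IS NOT NULL;

SELECT ID, MY_VALUE, TIME_INSERTED, TIME_UPDATED
FROM TEST.BLAH
WHERE TIME_UPDATED IS NULL;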

Thanks,
Toby