I'm hitting a weird issue in NiFi that I can't get my head around.
I am consuming Avro records from Kafka that contain a timestamp field. Its definition is:
{
  "name" : "timestamp",
  "type" : {
    "type" : "long",
    "logicalType" : "timestamp-millis"
  }
},
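For reference, a `timestamp-millis` long counts milliseconds since the Unix epoch in UTC. A minimal stdlib-Python sketch of how such a long maps to an instant (the concrete value 1758900584466 is my own illustrative assumption, chosen to match the timestamp shown below):

```python
from datetime import datetime, timedelta, timezone

# timestamp-millis: the long is milliseconds since the Unix epoch (UTC)
ts_millis = 1758900584466  # illustrative value only

# Use timedelta arithmetic rather than fromtimestamp() to avoid float rounding
dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(milliseconds=ts_millis)

# Render with millisecond precision, ISO-8601 style
iso = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{dt.microsecond // 1000:03d}Z"
print(iso)  # 2025-09-26T15:29:44.466Z
```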
If I use a JSON record writer, then in the queue viewer I can see the field still contains the original long value. However, if I use an Avro writer, the field is rendered as a date/time string, e.g. 2025-09-26T15:29:44.466Z.
I assumed this was just pretty-printing in the queue viewer. But if I then send those flowfiles to a ConvertRecord processor with an Avro reader and a FreeFormText writer, the timestamp field comes out formatted as yyyy-MM-dd HH:mm:ss, i.e. second precision only. That means I've lost the milliseconds, and I have to use expression language to re-parse the field and convert it back to a long.
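To make the precision loss concrete, here is a stdlib-Python sketch of the round trip (the long value and the second-resolution format pattern are assumptions based on the example above, not NiFi's actual code path):

```python
from datetime import datetime, timedelta, timezone

# Original Avro long: epoch milliseconds (timestamp-millis)
ts_millis = 1758900584466  # i.e. 2025-09-26T15:29:44.466Z

dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(milliseconds=ts_millis)

# A second-resolution format pattern drops the milliseconds on the way out...
text = dt.strftime("%Y-%m-%d %H:%M:%S")
print(text)  # 2025-09-26 15:29:44

# ...so re-parsing that text back into an epoch-millis long cannot recover them
reparsed = datetime.strptime(text, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
back_to_millis = int(reparsed.timestamp() * 1000)
print(back_to_millis)  # 1758900584000 -- the 466 ms are gone for good
```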
So I'm forced to use JSON as the intermediary format; if I use Avro I lose precision.
This is NiFi 2.5.