Support Questions

Facing issues with Date and Timestamp Columns

I am working on a simple requirement: loading data from Hive into PostgreSQL using Apache NiFi.

My processors are SelectHiveQL--->SplitAvro--->PutDatabaseRecord.

SelectHiveQL fetches the data from Hive. The Hive table has many columns, but I am facing issues with the Date and Timestamp columns. In PutDatabaseRecord I am getting this error:

":"PutDatabaseRecord[id=cecd3d18-f1d2-1d39-9e48-d4be96170480] Failed to process StandardFlowFileRecord[uuid=1bb1f4cf-3b06-4c0a-aaea-4156a3953ce9,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1510654697269-11, container=default, section=11], offset=186544, length=4932],offset=0,name=64863335688318.avro,size=4932] due to java.sql.BatchUpdateException: Batch entry 0 INSERT INTO some XXXX Columns was aborted: ERROR: column \"XXXdate\" is of type date but expression is of type character varying\nHint: You will need to rewrite or cast the expression.\nPosition: 932Call getNextException to see other errors in the batch.: ","timestamp":"06:14:09 EST"}}],"generated":"06:16:37 EST"}}

Would you please guide me in resolving this issue? Do I need to add any processor to the flow, and how do I get past this error?

Thanks in advance.


@Abdelkrim Hadjidj, any inputs on this question? Thanks in advance.

Can anybody please respond to my question? I am new to NiFi.

The issue I found here: I am using the SelectHiveQL processor to connect to Hive and fetch the data, and it reads the Date columns as strings in the Avro schema it produces:

{"name":"join_date","type":["null","string"]}

This is the main issue I found.

I have gone through a few blogs about this; they suggested using an UpdateAttribute processor to convert the string to a date, as below, but with no result.

${field.value:toDate('yyyy-mm-dd HH:mm:ss.SSS'):toNumber()}
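One side note on that expression (my own observation, not from the blogs): in the Java-style date format patterns NiFi uses, `mm` means minutes while `MM` means months, so `yyyy-mm-dd` would parse the month position as minutes. The intended pattern is presumably:

```
${field.value:toDate('yyyy-MM-dd HH:mm:ss.SSS'):toNumber()}
```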

I have also tried the same flow with the QueryDatabaseTable processor, which has a boolean property, Use Avro Logical Types, that I set to true, but it did not work and raised another issue:

QueryDatabaseTable[id=590aae62-0160-1000-ffff-ffffdb58c844] Unable to execute SQL select query SELECT * FROM test.DatesTable due to org.apache.nifi.processor.exception.ProcessException: Error during database query or conversion of records to Avro.: org.apache.nifi.processor.exception.ProcessException: Error during database query or conversion of records to Avro.

I am not sure how to resolve this issue. I found in one blog that if the Avro schema looks like this:

{"name":"join_date","type":["null",{"type":"int","logicalType":"date"}]}

it works, but I am not sure how to get this Avro format out of the SelectHiveQL or QueryDatabaseTable processors.

Help is highly appreciated.


I have attached the template XML for reference; please check it and let me know what I am doing wrong here.

Please help me resolve this issue.

New Contributor

Change Use Avro Logical Types to true.

New Contributor

Hi, you need to add one parameter to your JDBC connection pooling service. Something like this:
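(The original reply is cut off here. For this exact PostgreSQL error, the parameter usually meant is `stringtype=unspecified` on the PostgreSQL JDBC URL in the DBCPConnectionPool service; it makes the driver send string parameters as untyped so the server can cast them to date/timestamp itself. That this is what the replier intended is an assumption, and the host/database values below are placeholders:)

```
jdbc:postgresql://<host>:5432/<dbname>?stringtype=unspecified
```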