I am connecting a second PutDatabaseRecord (Update) processor to the failure relationship of the first PutDatabaseRecord (Insert) processor.
I made the following settings:
In ExecuteSQL, "Normalize Table/Column Names" is set to FALSE and "Use Avro Logical Types" to TRUE.
In PutDatabaseRecord, "Translate Field Names" is set to TRUE.
When injecting data from Oracle to Oracle or from PostgreSQL to PostgreSQL, both inserts and updates work fine.
The issue occurs only when data is injected from Oracle to PostgreSQL or from PostgreSQL to Oracle, and only for the UPDATE (the second PutDatabaseRecord).
When "Translate Field Names" is set to TRUE, it throws an error that the column defined in "Update Keys" doesn't exist. It is actually normalizing the column name from user_info_id to USERINFOID, i.e. removing the underscores and upper-casing the name.
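This is not NiFi's actual source code, but based on the behavior above, the field-name translation appears to do something like the following sketch (the function name is my own, for illustration only):

```python
def translate_field_name(name: str) -> str:
    """Approximation of the normalization observed in PutDatabaseRecord
    when "Translate Field Names" is TRUE: underscores are stripped and
    the name is upper-cased, so the result no longer matches a
    snake_case column name on the target database."""
    return name.replace("_", "").upper()

# The update key from my flow:
print(translate_field_name("user_info_id"))  # USERINFOID
```

After this translation, "USERINFOID" no longer matches the PostgreSQL column user_info_id, which would explain the "column doesn't exist" error on the Update Keys.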
When "Translate Field Names" is set to FALSE, it throws the error "No SET and WHERE Conditions exist in UPDATE query", presumably because none of the incoming record's field names match the target table's column names, so no SET or WHERE clause gets generated at all.
PutDatabaseRecord[id=016a1000-a460-111c-2c0e-edfab162e142] Failed to process StandardFlowFileRecord[uuid=769a00f2-e369-4c4f-aba6-686b2d10d892,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1555481494354-34, container=default, section=34], offset=526460, length=2510],offset=0,name=631e7185-e7bb-4e43-91e8-448e02b7bdac,size=2510] due to Batch entry 0 UPDATE nifi_target.test_table SET WHERE was aborted: ERROR: syntax error at or near "WHERE" Position: 36 Call getNextException to see other errors in the batch.
My guess is that Oracle-to-Oracle and PostgreSQL-to-PostgreSQL updates work because ExecuteSQL reads from the source with "Normalize Table/Column Names" set to FALSE, and the target is the same DB type as the source, so the column names already match and PutDatabaseRecord's normalization has no visible effect. But if the source and target are different DB types, the normalization takes effect and causes the issue.