
Oracle to PostgreSQL updates (CDC) throw an error when using PutDatabaseRecord

New Contributor

Hello Community Team,


I am ingesting data from Oracle to PostgreSQL using NiFi and trying to capture inserts/updates (CDC) with the flow below.

GenerateTableFetch --> ExecuteSQL --> SplitRecord --> PutDatabaseRecord (Insert) --> PutDatabaseRecord (Update)

The second PutDatabaseRecord (Update) processor is connected to the failure relationship of the first PutDatabaseRecord (Insert) processor.

I have configured the following:

In ExecuteSQL, "Normalize Table/Column Names" is set to FALSE and "Use Avro Logical Types" is set to TRUE.

In PutDatabaseRecord, "Translate Field Names" is set to TRUE.

Moving data from Oracle to Oracle or from PostgreSQL to PostgreSQL works fine for both inserts and updates.

The issue appears only when data goes from Oracle to PostgreSQL or from PostgreSQL to Oracle, and only for the UPDATE (the second PutDatabaseRecord).

When "Translate Field Names" is set to TRUE, it throws error that the column defined in "Update Keys" doesn't exist. It is actually Normalizing the Column Name From (user_info_id) to (USERINFOID). It is removing the underscores and making it as Upper Case.

When "Translate Field Names" is set to FALSE, it throws error that "No SET and WHERE Conditions exist in UPDATE query."

PutDatabaseRecord[id=016a1000-a460-111c-2c0e-edfab162e142] Failed to process StandardFlowFileRecord[uuid=769a00f2-e369-4c4f-aba6-686b2d10d892,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1555481494354-34, container=default, section=34], offset=526460, length=2510],offset=0,name=631e7185-e7bb-4e43-91e8-448e02b7bdac,size=2510] due to Batch entry 0 UPDATE nifi_target.test_table SET  WHERE  was aborted: ERROR: syntax error at or near "WHERE"  Position: 36  Call getNextException to see other errors in the batch.


My guess is that updates work for Oracle to Oracle and PostgreSQL to PostgreSQL because ExecuteSQL is connected to the source with "Normalize Table/Column Names" set to FALSE, and the target is the same database type, so the column names already match and no normalization happens in PutDatabaseRecord. But when the source and target are different database types, normalization takes effect and causes the issue.
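
To illustrate the guess, here is a toy Python sketch (not NiFi's actual code; all names are made up) of how the UPDATE statement can end up with empty SET and WHERE clauses when none of the incoming field names match the target's columns:

incoming_fields = ["USER_INFO_ID", "USER_NAME"]   # Oracle reports upper-case names
table_columns   = ["user_info_id", "user_name"]   # PostgreSQL stores lower-case names
update_keys     = ["user_info_id"]

# With case-sensitive matching nothing lines up, so both lists stay empty.
set_cols   = [c for c in incoming_fields if c in table_columns and c not in update_keys]
where_cols = [c for c in incoming_fields if c in update_keys]

sql = "UPDATE nifi_target.test_table SET {} WHERE {}".format(
    ", ".join("{} = ?".format(c) for c in set_cols),
    " AND ".join("{} = ?".format(c) for c in where_cols),
)
print(sql)  # -> UPDATE nifi_target.test_table SET  WHERE  (the syntax error above)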


Please help me find a solution here.

Thanks in advance.

1 Reply

New Contributor

@Shu, can you please respond to this question?

My only worry is how to keep column-name normalization set to FALSE throughout the flow before PutDatabaseRecord. Is there any other processor that can do this without changing the data flow? Conceptually, all I need is the renaming shown in the sketch below. Please respond.
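
For illustration only, here is a standalone Python sketch of the transformation I am after (this is not NiFi code; the file names and the fastavro package are just my assumptions for the example). It lower-cases every Avro field name so the records would match PostgreSQL's lower-case columns:

import fastavro

# Read the Avro file produced by ExecuteSQL (file name is hypothetical).
with open("input.avro", "rb") as src:
    reader = fastavro.reader(src)
    schema = reader.writer_schema
    records = list(reader)

# Lower-case the field names in the schema...
for field in schema["fields"]:
    field["name"] = field["name"].lower()

# ...and the keys in every record to match.
records = [{k.lower(): v for k, v in rec.items()} for rec in records]

# Write the renamed records back out for PutDatabaseRecord to consume.
with open("output.avro", "wb") as dst:
    fastavro.writer(dst, schema, records)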