Nifi Processor : Insert data into CLOB column

New Contributor

I have a data pipeline that consumes messages from Kafka and inserts them into an Oracle database. The messages from Kafka are in JSON format. If any error occurs during processing, the whole message (flowfile content) is inserted into an invalid-payload table. A trimmed version of the pipeline is as follows: GenerateFlowFile => ReplaceText => PutSQL.

clob-insert-test.xml

The table structure is as follows:

CREATE TABLE CLOB_TEST (
  "TRAN_ID" VARCHAR2(36 BYTE) NOT NULL ENABLE,
  "PAYLOAD" CLOB NOT NULL ENABLE,
  CONSTRAINT "CLOB_TEST" PRIMARY KEY ("TRAN_ID")
)

The payload is more than 4000 characters. Creating the INSERT SQL using ReplaceText fails with the following error:

Error report -

SQL Error: ORA-01704: string literal too long

01704. 00000 -"string literal too long"

*Cause:The string literal is longer than 4000 characters.

*Action: Use a string literal of at most 4000 characters.

Longer values may only be entered using bind variables.

Is there any other option to bind the CLOB column and insert it into the table? Any help will be greatly appreciated. Thanks.
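For context, ORA-01704 means the string literal itself exceeds Oracle's 4000-character limit; longer values must be passed as bind parameters rather than inlined into the SQL text. Outside of NiFi, the equivalent fix in plain JDBC looks like the sketch below (the table and columns come from the CLOB_TEST DDL above; the helper name and everything else is illustrative):

```java
import java.io.StringReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class ClobInsert {
    // Parameterized statement: the payload is bound, never inlined as a
    // string literal, so the 4000-character literal limit does not apply.
    static final String INSERT_SQL =
        "INSERT INTO CLOB_TEST (TRAN_ID, PAYLOAD) VALUES (?, ?)";

    // Hypothetical helper: call with an open JDBC connection to Oracle.
    static void insertPayload(Connection conn, String tranId, String payload)
            throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            ps.setString(1, tranId);
            // Stream the CLOB value instead of embedding it in the SQL text
            ps.setCharacterStream(2, new StringReader(payload), payload.length());
            ps.executeUpdate();
        }
    }
}
```

This is the same prepared-statement binding that PutSQL or PutDatabaseRecord would perform internally; the point is only that the CLOB value travels as a parameter, not as part of the statement string.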


screen-shot-2018-09-24-at-111751-am.png



Re: Nifi Processor : Insert data into CLOB column

ReplaceText puts the CLOB data directly into the SQL statement as a string literal, which is why that error occurs. You could use a prepared statement (PutSQL's sql.args.N.value attributes) to get around this, but you very likely don't want to put the CLOB into an attribute, since it is large.

The alternative is PutDatabaseRecord: you can provide a JsonTreeReader (with a schema) and it will generate the prepared statement, but take the CLOB data from the flowfile content rather than from an attribute. If you don't know the schema (but the JSON contains only primitive types), you could use InferAvroSchema if necessary. However, it is preferable to provide the schema directly via the Record Reader and/or a Schema Registry.
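For illustration, a minimal Avro schema for the JsonTreeReader might look like this; the record name and the assumption that the JSON fields are named after the TRAN_ID and PAYLOAD columns are made up for this example, so adjust them to match your actual JSON:

```
{
  "type": "record",
  "name": "InvalidPayload",
  "fields": [
    { "name": "TRAN_ID", "type": "string" },
    { "name": "PAYLOAD", "type": "string" }
  ]
}
```

PutDatabaseRecord then maps each record field to the matching column in the target table when it builds the prepared statement.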

Re: Nifi Processor : Insert data into CLOB column

New Contributor
Hi Matt, I am new to NiFi and have a similar use case where I want to insert the entire flowfile content as a CLOB value. It throws an error when I just use the ReplaceText and PutSQL processors, and I don't have any idea how to handle it using PutDatabaseRecord. Can you provide an example of this?

Re: Nifi Processor : Insert data into CLOB column

New Contributor

Thanks Matt for your reply. The whole flowfile content has to be inserted into the database. Is there a way to configure the JsonTreeReader to treat the whole flowfile content as a value in the prepared statement?

Regards

Subbu

Re: Nifi Processor : Insert data into CLOB column

The entire flowfile content is one "row" to be inserted into the database? You should still be able to use this approach: as of NiFi 1.7.0, the JsonTreeReader recognizes "one JSON object per line" input rather than expecting an array.
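In other words, a flowfile whose content is a sequence of standalone JSON objects, one per line, should be readable; the field names and values here are purely illustrative:

```
{"TRAN_ID": "id-1", "PAYLOAD": "{\"order\": 1}"}
{"TRAN_ID": "id-2", "PAYLOAD": "{\"order\": 2}"}
```

Each line becomes one record, and therefore one row inserted by PutDatabaseRecord.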

Re: Nifi Processor : Insert data into CLOB column

New Contributor

Can you upload or share a sample workflow that uses PutDatabaseRecord to insert the flowfile content into the database?

Thanks, Subbu