
Apache NiFi PutDatabaseRecord processor error: None of the fields in the record map to the columns defined by the dbo.Employee table

Explorer

I'm currently working on a NiFi flow that reads a table from DB1 (SQL Server 1) using ExecuteSQL and puts the records into DB2 (SQL Server 2) using the PutDatabaseRecord processor. When I run the flow I get the error "None of the fields in the record map to the columns defined by the dbo.Employee table".

NiFi version : 1.7.1

SQL table schema: CREATE TABLE dbo.Employee ( EmpID bigint, FName nvarchar(100), LName nvarchar(100), DOB date, DOJ date )

ExecuteSQL property "Use Avro Logical Types" : false

PutDatabaseRecord property "Record Reader" : AvroReader

AvroReader property "Schema Access Strategy" : Use 'Schema Text' Property

Schema Text: {"type":"record","name":"Employee","namespace":"dbo","fields":[{"name":"EmpID","type":["null","long"]},{"name":"FName","type":["null","string"]},{"name":"LName","type":["null","string"]},{"name":"DOB","type":["null","string"]},{"name":"DOJ","type":["null","string"]}]}

Please help me! Thanks in advance.

8 Replies

Master Guru

Is "Translate Field Names" set to true in PutDatabaseRecord? If not try that. If so, try removing the "namespace":"dbo" from the schema. If that works, then it's probably a bug, as we should be trying to match on fullname but fall back to simple field names.

Explorer

Thanks for the reply Matt.

"Translate Field Names" property was set to false. Even after setting this property to true same issue. Also removed "namespace":"dbo" from the schema and tried. Still getting same error.

Below is my sample Avro file. The embedded schema is:

{"type":"record","name":"NiFi_ExecuteSQL_Record","namespace":"any.data","fields":[{"name":"EmpID","type":["null","long"]},{"name":"FName","type":["null","string"]},{"name":"LName","type":["null","string"]},{"name":"DOB","type":["null","string"]},{"name":"DOJ","type":["null","string"]}]}

It is followed by a single binary-encoded record with values such as "John", "Smith", "1956-10-03" and "2018-06-17".

Explorer

The error got resolved after adding the missing SplitAvro processor.

The new flow looks like ExecuteSQL --> SplitAvro --> PutDatabaseRecord.
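For anyone trying the same workaround, a minimal SplitAvro configuration would just use the processor defaults (I'm assuming nothing else needs to change):

SplitAvro property "Split Strategy" : Record Split
SplitAvro property "Output Size" : 1

This splits the multi-record Avro file coming out of ExecuteSQL into one record per FlowFile before PutDatabaseRecord.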

Master Guru

You shouldn't need SplitAvro in there; PutDatabaseRecord was designed to handle multiple records. I'll try to reproduce the issue and will keep you posted.

Explorer

@mburgess I encountered a similar problem while trying to ingest data from one Oracle table to another. It does not work even if I use SplitAvro.

NiFi Version: 1.7.1

Source Table: CREATE TABLE IC_STAGE.TEMP_NIFI_1 ( ID_DATE VARCHAR2(500 BYTE), REC NUMBER )

Destination Table: CREATE TABLE IC_STAGE.TEMP_NIFI ( ID_DATE VARCHAR2(500 BYTE), REC NUMBER )

ExecuteSQL: SELECT * FROM IC_STAGE.TEMP_NIFI_1

PutDatabaseRecord properties:

  • Record Reader: Avro Reader
    • Schema Access Strategy: Use Embedded Avro Schema
(Screenshots attached: nifi-image-1.png, nifi-image-2.png)

My schema, when I view the details of the queue, looks like this:

{"type":"record","name":"NiFi_ExecuteSQL_Record","namespace":"any.data","fields":[{"name":"ID_DATE","type":["null","string"]},{"name":"REC","type":["null",{"type":"bytes","logicalType":"decimal","precision":10,"scale":0}]}]}

I got the same error even when I used "Schema Text" as my access strategy.

Explorer

I found the solution today. I modified my PutDatabaseRecord processor: I explicitly set the Oracle schema (IC_STAGE) in the processor's Schema Name property instead of including it as part of the Table Name.

(Screenshot attached: nifi-image-3.png)
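In text form, the relevant PutDatabaseRecord properties look roughly like this (values inferred from the description above; the screenshot is the authoritative reference):

PutDatabaseRecord property "Record Reader" : Avro Reader
PutDatabaseRecord property "Statement Type" : INSERT
PutDatabaseRecord property "Schema Name" : IC_STAGE
PutDatabaseRecord property "Table Name" : TEMP_NIFI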

I am now using only 2 processors, ExecuteSQL and PutDatabaseRecord. No need for the SplitAvro processor.

(Screenshot attached: nifi-image-4.png)

New Contributor

Hi,

I am also trying to fetch data from table 1 and insert it into table 2; both data sources are Oracle. I am able to fetch the data successfully, but I get an error on the insert: PutDatabaseRecord reports 'none of the fields from the record map are matching with table 2'.

New Contributor

@nkimani When I set the table name to "EMPLOYEE", it gives the error 'stream has been closed', and when I set the table name to 'employee', it gives the error 'none of the fields in the record map match the employee table'.