Created 10-25-2018 12:28 PM
Currently I am working on a NiFi flow which reads a table from DB1 (SQL Server 1) using ExecuteSQL and writes the records into DB2 (SQL Server 2) using the PutDatabaseRecord processor. I'm getting the error "None of the fields in the record map to the columns defined by the dbo.Employee table" while running the flow.
NiFi version : 1.7.1
SQL table schema: CREATE TABLE dbo.Employee ( EmpID bigint, FName nvarchar(100), LName nvarchar(100), DOB date, DOJ date )
ExecuteSQL property "Use Avro Logical Types" : false
PutDatabaseRecord property "Record Reader" : AvroReader
AvroReader property "Schema Access Strategy" : Use 'Schema Text' Property
Schema Text: {"type":"record","name":"Employee","namespace":"dbo","fields":[{"name":"EmpID","type":["null","long"]},{"name":"FName","type":["null","string"]},{"name":"LName","type":["null","string"]},{"name":"DOB","type":["null","string"]},{"name":"DOJ","type":["null","string"]}]}
Please help me! Thanks in advance.
Created 10-25-2018 04:50 PM
Is "Translate Field Names" set to true in PutDatabaseRecord? If not try that. If so, try removing the "namespace":"dbo" from the schema. If that works, then it's probably a bug, as we should be trying to match on fullname but fall back to simple field names.
Created 10-26-2018 12:28 PM
Thanks for the reply, Matt.
The "Translate Field Names" property was set to false. Even after setting it to true, I see the same issue. I also removed "namespace":"dbo" from the schema and tried again, but I'm still getting the same error.
Below is my sample avro:
Embedded schema: {"type":"record","name":"NiFi_ExecuteSQL_Record","namespace":"any.data","fields":[{"name":"EmpID","type":["null","long"]},{"name":"FName","type":["null","string"]},{"name":"LName","type":["null","string"]},{"name":"DOB","type":["null","string"]},{"name":"DOJ","type":["null","string"]}]}
(The binary record data that follows the schema decodes to roughly FName "John", LName "Smith", DOB "1956-10-03", DOJ "2018-06-17".)
Created 12-05-2018 07:39 AM
The error got resolved after adding the missing SplitAvro processor.
The new flow looks like ExecuteSQL --> SplitAvro --> PutDatabaseRecord
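For anyone trying the same workaround, a minimal SplitAvro configuration would be something like the following (splitting into one record per FlowFile is an assumption on my part; property names as in NiFi 1.7.x):
SplitAvro property "Split Strategy" : Record Split
SplitAvro property "Output Size" : 1
SplitAvro property "Transfer Metadata" : true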
Created 12-05-2018 06:30 PM
You shouldn't need SplitAvro in there; PutDatabaseRecord was designed to handle multiple records. I'll try to reproduce the issue and will keep you posted.
Created on 12-26-2018 11:28 AM - edited 08-17-2019 06:51 PM
@mburgess I encountered a similar problem while trying to ingest data from one Oracle table to another. It does not work even if I use SplitAvro.
NiFi Version: 1.7.1
Source Table: CREATE TABLE IC_STAGE.TEMP_NIFI_1 ( ID_DATE VARCHAR2(500 BYTE), REC NUMBER )
Destination Table: CREATE TABLE IC_STAGE.TEMP_NIFI ( ID_DATE VARCHAR2(500 BYTE), REC NUMBER )
ExecuteSQL: SELECT * FROM IC_STAGE.TEMP_NIFI_1
PutDatabaseRecord properties:
My schema, when I view the details of the queue, looks like this:
{"type":"record","name":"NiFi_ExecuteSQL_Record","namespace":"any.data","fields":[{"name":"ID_DATE","type":["null","string"]},{"name":"REC","type":["null",{"type":"bytes","logicalType":"decimal","precision":10,"scale":0}]}]}
I got the same error even when I used "Schema Text" as my access strategy.
Created on 12-29-2018 11:39 AM - edited 08-17-2019 06:51 PM
I found the solution today. I modified my PutDatabaseRecord processor: I explicitly set the Oracle schema in the Schema Name property instead of including it as part of the Table Name.
I am now using only 2 processors, ExecuteSQL and PutDatabaseRecord. There is no need for the SplitAvro processor.
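For reference, a sketch of the PutDatabaseRecord properties that worked for me (the Statement Type value is an assumption here; Schema Name and Table Name match the destination table above):
PutDatabaseRecord property "Statement Type" : INSERT
PutDatabaseRecord property "Schema Name" : IC_STAGE
PutDatabaseRecord property "Table Name" : TEMP_NIFI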
Created 08-20-2020 05:43 AM
Hi,
I am also trying to fetch data from table 1 and insert it into table 2. Both data sources are Oracle. I am able to fetch the data successfully, but the insert fails: PutDatabaseRecord gives the error 'none of the fields from the record map are matching with table 2'.
Created on 08-20-2020 05:50 AM - edited 08-20-2020 05:55 AM
@nkimani When I change the table name to "EMPLOYEE" it gives the error 'stream has been closed', and when I give the table name 'employee' it gives the error 'none of the fields are matching with the record map of the employee table'.