Posted 11-12-2018, 09:48 PM
Hi all,

I'm fairly new to NiFi. I've read a bunch of blogs and forum threads but still haven't been able to figure out my issue, so I'm asking my questions here. I'm running NiFi 1.8.0 in a Docker container, trying to read from an Oracle DB and write to an Oracle DB (11g). Here is my target table definition:

    create table nifi_dim_bb_language (
        pk integer,
        isocode varchar2(10),
        activeflag integer
    );

I've built two different dataflows and both are failing. Screenshots from every processor are enclosed.

A/ 1. QueryDatabaseTable > 2. ConvertAvroToJson > 3. ConvertJsonToSQL > 4. PutSQL

1. QueryDatabaseTable: I can read the data from the source Oracle table (which contains only 3 records) and write it to a JSON file in a temp directory. Here is what the JSON file looks like:

    {"PK": "8796093087776", "ISOCODE": "fr", "ACTIVEFLAG": "1"}
    {"PK": "8796093055008", "ISOCODE": "en", "ACTIVEFLAG": "1"}
    {"PK": "8796256927776", "ISOCODE": "de", "ACTIVEFLAG": "1"}

2. ConvertAvroToJson: This works, since the Avro data from the table is converted into the JSON file above.

3. ConvertJsonToSQL: This doesn't work, even though I connect only the 'sql' relationship to the downstream PutSQL processor. I've played a bit with the properties ('Translate Field Names' and so on), but I get the same error every time:

    Error: None of the fields in the JSON map to the columns defined by the nifi_dim_bb_language table

4. PutSQL: Nothing gets written into the table.

B/ 1. QueryDatabaseTable > 5. PutDatabaseRecord

1. QueryDatabaseTable: Same processor as above.

5. PutDatabaseRecord: I defined an AvroReader and set 'Statement Type' to Insert, but it is still not working:

    Warning: None of the fields in the record map to the columns defined by the nifi_dim_bb_language table

Thanks in advance for the help,
Sid
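For anyone puzzling over the error above: a minimal Python sketch (not NiFi source code, just an illustration) of the kind of matching ConvertJsonToSQL attempts between incoming JSON field names and the column names reported by the database metadata. With 'Translate Field Names' enabled, names are compared case-insensitively with underscores removed; the error fires only when no field matches any column. The `normalize` helper and the hard-coded column list are assumptions for the sake of the example.

```python
# One of the JSON records from the flow above.
record = {"PK": "8796093087776", "ISOCODE": "fr", "ACTIVEFLAG": "1"}

# Columns as Oracle's metadata would typically report them for an
# unquoted identifier (uppercase by default). Hard-coded here for illustration.
columns = ["PK", "ISOCODE", "ACTIVEFLAG"]

def normalize(name):
    # Assumed approximation of 'Translate Field Names': compare
    # case-insensitively after stripping underscores.
    return name.upper().replace("_", "")

# Map each table column to the matching JSON field value, if any.
mapped = {c: record[f] for c in columns for f in record
          if normalize(f) == normalize(c)}

# Build the parameterized INSERT that would be handed to PutSQL.
sql = "INSERT INTO nifi_dim_bb_language ({}) VALUES ({})".format(
    ", ".join(mapped), ", ".join("?" * len(mapped)))
```

Since the JSON keys here are already uppercase and match the Oracle columns exactly, every field maps cleanly in this sketch, which suggests the real mismatch lies elsewhere (e.g. how the processor looks up the table's metadata), not in the field names themselves.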