Member since: 02-28-2018
Posts: 15
Kudos Received: 2
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1119 | 05-04-2018 07:40 AM |
05-04-2018
07:40 AM
Hello, That wasn't the case. Anyway, I tried using the UpdateAttribute processor to change the filename extension to .CSV, and it works this way. Thank you for your time! Kind regards, Stefan
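PS, for anyone who hits the same thing: the UpdateAttribute property I added was roughly the following (written from memory, so the exact expression may differ slightly):
filename = ${filename:substringBeforeLast('.')}.csv
It simply renames the incoming .DAT flowfile to a .csv extension so the downstream processors treat the content as CSV.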
05-02-2018
10:04 AM
We receive the .DAT files directly from the client. In the normal process we have in place (ETL using ODI), we open them in Excel. But I wanted to know if there is a way to convert or read them in NiFi, because we need to store their data in a table.
04-29-2018
03:52 PM
Dear Hortonworks, How can we read a .DAT file in NiFi or transform it into CSV or JSON? Should I create a custom processor in Java, or is there a possibility to read it directly in NiFi? Thank you! Kind regards, Stefan
- Tags:
- files
04-27-2018
09:11 AM
Hello all, I have a weird issue with the PutSQL processor. I am trying to execute a SQL insert script but it doesn't work: the processor just stays in running mode without doing anything. In the logs I do not have any errors or warning messages. The connections to the DB are correct, verified with our DBA; even more, the DBA said there are no open sessions. I am saying this is weird because it was working before with the same workflow. Also, there were no other modifications or patches applied at the DB level which could stop this processor from inserting data. Moreover, ExecuteSQL is working: I can extract data from the tables. The schema/user also has all the privileges needed for this task, and the tables are well created; some don't even have a primary key, in order to avoid other issues which maybe NiFi didn't show. Also, ReplaceText produces a perfect script: I copied it and executed it via Toad and SQL Developer, and the line is inserted. I am using NiFi 1.4.0 locally, but I also tried with NiFi 1.5.0 and the behavior is the same. I also tried from another colleague's computer and it is the same. Any advice would be great. Thank you! Kind regards, Stefan
04-03-2018
08:52 AM
1 Kudo
Hello everyone! I tried with the embedded schema, like Shu mentioned, and now everything is working perfectly. I feel so bad that it was so easy and simple, just a few processors... Anyway, thank you for your time and help, I really appreciate it! Kind regards, Stefan
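PS, for anyone who finds this thread later: as far as I remember, the key setting was on the AvroReader controller service used by ConvertRecord, something like the line below, but please double-check the exact option name in your NiFi version:
Schema Access Strategy = Use Embedded Avro Schema
With that, the reader picks up the schema that ExecuteSQL already embeds in its Avro output, so no separate schema definition is needed.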
03-30-2018
02:44 PM
Hello @Shu, I didn't add it here, but of course I also tried with uppercase letters and still have the same issue with ConvertRecord. With EvaluateJsonPath, same issue again. I tried many other things but nothing worked. I thought it would be really easy to output a file from a simple select query on a table, but it seems that is not the case...
03-30-2018
12:12 PM
JSON input:
[ {
"INV_IDN" : "247048764",
"INV_NUMBER" : "181120060",
"USR_MDF" : "15/03/2018 08:34:00 by LDL",
"INVISSDAT" : "15/03/2018",
"INVCLICOD" : "FUNDQ",
"INVCLINAM" : "FUNDQUEST",
"INVCLI_REGCOUNTRY" : "FR",
"INVOICECLASS" : "CUS - assujetti in EU outside Lx",
"CORCLICOD" : "BNPAMFR",
"OFOCHINMB" : "20173748",
"BSNCTGDSC" : "Fund Data Management",
"BSNCTGCOD" : "LIS",
"BSNCTGDSPCOD" : "FDM",
"INVTYP" : "Credit Note",
"INVSTU" : "Validated",
"INVORGCOD" : "LU",
"CRYCOD" : "EUR",
"AMNWTHVAT" : "-7,543.23",
"AMNWTHVATINEUR" : "-7,543.23",
"MEDIAFEESINEUR" : "0",
"KCFEESINEUR" : "-7543.23"
} ]
JSON schema:
{
"type": "record",
"name": "DnpReport",
"fields" : [
{"name": "inv_idn", "type": ["null", "string"]},
{"name": "inv_number", "type": ["null", "string"]},
{"name": "usr_mdf", "type": ["null", "string"]},
{"name": "invIssDat", "type": ["null", "string"]},
{"name": "invCliCod", "type": ["null", "string"]},
{"name": "invCliNam", "type": ["null", "string"]},
{"name": "invCli_RegCountry", "type": ["null", "string"]},
{"name": "invoiceClass", "type": ["null", "string"]},
{"name": "corCliCod", "type": ["null", "string"]},
{"name": "ofoChiNmb", "type": ["null", "string"]},
{"name": "bsnCtgDsc", "type": ["null", "string"]},
{"name": "bsnCtgCod", "type": ["null", "string"]},
{"name": "bsnCtgDspCod", "type": ["null", "string"]},
{"name": "invTyp", "type": ["null", "string"]},
{"name": "invStu", "type": ["null", "string"]},
{"name": "invOrgCod", "type": ["null", "string"]},
{"name": "cryCod", "type": ["null", "string"]},
{"name": "amnWthVat", "type": ["null", "string"]},
{"name": "amnWthVatInEur", "type": ["null", "string"]},
{"name": "mediaFeesInEur", "type": ["null", "string"]},
{"name": "kcFeesInEur", "type": ["null", "string"]}
]
}
03-30-2018
10:18 AM
Hello @Shu, @Rahul Soni, Do you have any other ideas? Thank you! Kind regards, Stefan
03-30-2018
08:10 AM
Hello @Rahul Soni, From the beginning I tried with null as default values, but I still receive that error. I also validated the JSON, so it is OK. Here is my JSON:
{
"type": "record",
"name": "DnpReport",
"fields" : [
{"name": "inv_idn", "type": ["null", "string"]},
{"name": "inv_number", "type": ["null", "string"]},
{"name": "usr_mdf", "type": ["null", "string"]},
{"name": "invIssDat", "type": ["null", "string"]},
{"name": "invCliCod", "type": ["null", "string"]},
{"name": "invCliNam", "type": ["null", "string"]},
{"name": "invCli_RegCountry", "type": ["null", "string"]},
{"name": "invoiceClass", "type": ["null", "string"]},
{"name": "corCliCod", "type": ["null", "string"]},
{"name": "ofoChiNmb", "type": ["null", "string"]},
{"name": "bsnCtgDsc", "type": ["null", "string"]},
{"name": "bsnCtgCod", "type": ["null", "string"]},
{"name": "bsnCtgDspCod", "type": ["null", "string"]},
{"name": "invTyp", "type": ["null", "string"]},
{"name": "invStu", "type": ["null", "string"]},
{"name": "invOrgCod", "type": ["null", "string"]},
{"name": "cryCod", "type": ["null", "string"]},
{"name": "amnWthVat", "type": ["null", "string"]},
{"name": "amnWthVatInEur", "type": ["null", "string"]},
{"name": "mediaFeesInEur", "type": ["null", "string"]},
{"name": "kcFeesInEur", "type": ["null", "string"]}
]
}
Kind regards, Stefan
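PS, to be precise: when I tried the defaults, each field was declared roughly like this (typed from memory, so it may differ slightly from what I actually used):
{"name": "inv_idn", "type": ["null", "string"], "default": null}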
03-29-2018
04:11 PM
Hello @Shu, Unfortunately, neither flow worked. The first one, with ReplaceText having the properties you mentioned, still outputs empty values even though the attributes are the same. For the second one, with ConvertRecord, I receive this error:
Failed to process StandardFlowFileRecord[...] will route to failure: java.lang.ArrayIndexOutOfBoundsException: -40
java.lang.ArrayIndexOutOfBoundsException: -40
I even tried converting the Avro to JSON and then passing it to ConvertRecord, but I get the same error... Any help will be welcome! Thank you! Kind regards, Stefan
03-29-2018
11:31 AM
1 Kudo
Dear all, I want to create a CSV file on a monthly basis by executing a SQL script. For this I am using the ExecuteSQL processor, followed by ConvertAvroToJSON, SplitJSON, EvaluateJsonPath, ReplaceText and PutFile. My problem is with the ReplaceText processor: it produces a list with empty values even though the JSON values are there and are valid. Please see the attached files (replacetext.jpg). I tried every recommended solution provided here on the community, but nothing helped. Kind regards, Stefan
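PS: since the attachment may not be visible, this is roughly how the ReplaceText processor is configured; the attribute names come from my EvaluateJsonPath properties, so treat the exact values as an approximation:
Replacement Strategy = Always Replace
Evaluation Mode = Entire text
Replacement Value = ${INV_IDN},${INV_NUMBER},${USR_MDF},${INVISSDAT},...
The intention is to overwrite the flowfile content with a comma-separated line built from the extracted attributes.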
Labels:
- Apache NiFi
03-18-2018
12:40 PM
Hello @Matt Burgess, I am sorry for the bad name, but yes, in the DB the name of the table is identical; I copied the table name from the script and put it in the processor. I also dropped the table and recreated it. I tried in another DB as well and the behavior is the same. Thank you for letting me know if you can think of other solutions or workarounds. As you can see, I apply a JSON schema to a CSV file, and I want to insert the good records into an Oracle table. Kind regards, Stefan
03-16-2018
10:28 AM
Dear all, I set the property Unmatched Field Behavior to Fail and Unmatched Column Behavior to Warn in order to see which columns are "bad". I started the processor again, and the message Cannot map JSON field 'competitorname' to any column in the database is wrong, because I took that name, searched the table, and there is no mismatch; the names are identical. I have no idea how to proceed further... I tried everything... Could you please advise? Thank you! Kind regards, Stefan
03-16-2018
10:10 AM
Dear all, I am using the ConvertJsonToSQL processor in order to convert a plain JSON to SQL, but it doesn't work, saying: None of the fields in the JSON map to the columns defined by the MY_TABLE table. I tried with the property Translate Field Names set to false, because the columns are identical, but also set to true. The DBCP connection is properly set, and the Avro schema as well. Can you let me know what else I should try? Thank you! (Attachments: logic.jpg, convertjsontosql.jpg.) Kind regards, Stefan
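PS: one way to double-check the column names on the Oracle side is a query along these lines (just an illustration; MY_TABLE is the placeholder name from the error message):
-- list the column names exactly as Oracle stores them for the target table
SELECT column_name FROM user_tab_columns WHERE table_name = 'MY_TABLE';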
Labels:
- Apache NiFi
02-28-2018
02:00 PM
Dear all, We will soon migrate to a Big Data platform with Hortonworks. Yay! Currently we have large projects based on ETL/ELT processes using ODI, and my company wants to migrate them, or at least the new projects, to NiFi. So here is my question: knowing that NiFi is more of a data-flow tool, specifically for loading massive amounts of data into a data lake for example, what are the features of NiFi that would let me replace an ETL/ELT process, concentrating on the data transformation/checking part? I know about SplitText, ExtractText and so on, but I didn't see the specific filters... For example, how can I check each record of a CSV/XLSX file to see whether it has the correct length, whether a value is really a number and not a varchar, or whether column X having a value implies that column Z also has a value? For the lines whose records are good, we insert them into a repository table and then join it with a materialized view in order to produce a CSV file with specific content. Here is my second question: how can we make such a join in NiFi, and how do we interact with the tables of a schema? If you do not have time to answer, a link showing how to solve this would also be great. I think these questions are on the minds of all developers who want to migrate from ETL/ELT to NiFi. Thank you for your time and help! Stefan
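PS: from what I could find in the documentation so far, the QueryRecord processor might cover part of the record checking by routing the records that pass a SQL query to a "valid" relationship, something like the sketch below; the column names are only placeholders and I am not sure about the exact quoting rules, so please correct me if this is not the intended approach:
SELECT *
FROM FLOWFILE
WHERE CHAR_LENGTH(COL_A) = 10          -- COL_A must have the expected length
  AND (X IS NULL OR Z IS NOT NULL)     -- if column X has a value, column Z must have one too
There also seems to be a ValidateRecord processor in the more recent NiFi versions that checks records against a schema, which might cover the "is it really a number" kind of check. For the join with the materialized view, maybe it is simpler to keep it on the database side and let ExecuteSQL run the already-joined query, or to use LookupRecord for the enrichment, but any confirmation or a better pattern would be very welcome.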
- Tags:
- etl