Member since: 05-24-2022
Posts: 11
Kudos Received: 1
Solutions: 4
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1265 | 05-22-2023 07:47 AM
 | 664 | 04-25-2023 04:07 AM
 | 2973 | 07-06-2022 06:23 AM
 | 1344 | 06-15-2022 08:07 AM
05-22-2023
07:58 AM
It sounds like you are using OAuth, which means you should be using an OAuth2AccessTokenProvider controller service. See my screenshot, which shows the configuration line inside InvokeHTTP that lets you specify this controller service. The service automatically handles fetching the token and embedding it in the request header, so your flowfile remains intact. This will greatly simplify your current flow, because you can get rid of the map cache and that whole process cycle.
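To illustrate what the controller service does on your behalf, here is a minimal Python sketch of turning a standard OAuth2 token response into an Authorization header. The token values are placeholders, not real credentials; the actual token fetch against your provider's endpoint is omitted.

```python
def build_auth_header(token_response: dict) -> dict:
    """Build the request header NiFi would inject, from a typical
    OAuth2 token response ({"access_token": ..., "token_type": ...})."""
    token_type = token_response.get("token_type", "Bearer")
    access_token = token_response["access_token"]
    return {"Authorization": f"{token_type} {access_token}"}

# Example token response shape from a typical /oauth/token endpoint:
resp = {"access_token": "abc123", "token_type": "Bearer", "expires_in": 3600}
header = build_auth_header(resp)
print(header)
```

The point is that the token lives in the header, not in the request body, which is why the flowfile content is left untouched.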
04-25-2023
04:07 AM
I used this Jolt spec, with the help of ChatGPT, and it worked:

```json
[
  {
    "operation": "shift",
    "spec": {
      "systemId": "systemId",
      "systemName": "systemName",
      "items.nid": "items.nid",
      "items.birth_date": "items.birth_date",
      "items.last_name": "items.last_name",
      "items.entity_type": "items.entity_type",
      "items.citizenship": "items.citizenship",
      "items.nationality": "items.nationality",
      "items.business_name": "items.business_name",
      "items.customer_type": "items.customer_type",
      "items.first_name": "items.first_name"
    }
  },
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "items": {
        "*": "=ifEmpty(@(1,&),'')"
      }
    }
  },
  {
    "operation": "shift",
    "spec": {
      "systemId": "systemId",
      "systemName": "systemName",
      "items": "items"
    }
  }
]
```
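For anyone reading along without a Jolt runtime handy, the middle `modify-overwrite-beta` step is the interesting one: `=ifEmpty(@(1,&),'')` replaces missing or null values under `items` with an empty string. A rough Python equivalent of just that step (my own sketch, not part of the spec):

```python
def default_empty(items: dict) -> dict:
    """Mimic Jolt's =ifEmpty(@(1,&),'') over the items object:
    any None (null) value becomes an empty string."""
    return {k: ("" if v is None else v) for k, v in items.items()}

record = {"systemId": "S1", "items": {"nid": None, "first_name": "Ana"}}
record["items"] = default_empty(record["items"])
```

After this, `record["items"]["nid"]` is `""` instead of `null`, matching what the spec produces.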
11-29-2022
09:27 AM
Hi, I think after you split your CSV you need to extract the values of both columns, status and client_id, into attributes and then use them in the ExecuteSQL processor. To do that:

1. Convert the record from CSV to JSON format using the ConvertRecord processor.
2. Use EvaluateJsonPath to extract both columns into defined attributes (dynamic properties). Make sure to set the Destination property to "flowfile-attribute".

After that you can reference those attributes in the SQL query as ${status} and ${client_id}, assuming that's what you named the attributes in step 2.

Another option, if you don't want to use two processors, is the ExtractText processor: provide a regex to extract each value, but be careful how you define the regex for each value so that you pull only those values and nothing else.

Hope that helps. If that answers your question, please accept the solution. Thanks
07-06-2022
06:23 AM
Thank you all for your replies. The fix was simple: all I needed to do was specify which database schema I wanted to work with, or have the lookup service connect to. I just added the schema name before the table name, like this: sandbox_s01.table_name
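For anyone hitting the same error, the effect of schema-qualifying the table name can be demonstrated outside NiFi too. This sketch uses SQLite's ATTACH as a stand-in for a named schema (the original setup was a different RDBMS, so this is illustrative only):

```python
import sqlite3

# ATTACH gives us a named "schema" (sandbox_s01) to qualify tables with.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS sandbox_s01")
conn.execute("CREATE TABLE sandbox_s01.table_name (id INTEGER, name TEXT)")
conn.execute("INSERT INTO sandbox_s01.table_name VALUES (1, 'first')")

# Without the sandbox_s01. prefix this table would not be found,
# which is the same failure the lookup service was hitting.
row = conn.execute(
    "SELECT name FROM sandbox_s01.table_name WHERE id = 1"
).fetchone()
print(row[0])
```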
06-15-2022
08:07 AM
I used ConvertCharacterSet and changed the encoding from UTF-16 to UTF-8, and that fixed it for me. Thanks
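Under the hood this is a decode-then-re-encode, which a couple of lines of Python can demonstrate (the sample string here just stands in for the flowfile content):

```python
# What ConvertCharacterSet does: decode bytes with the input charset,
# re-encode with the output charset.
utf16_bytes = "héllo".encode("utf-16")          # incoming UTF-16 content
utf8_bytes = utf16_bytes.decode("utf-16").encode("utf-8")
print(utf8_bytes.decode("utf-8"))
```

If the declared input charset is wrong, the decode step fails or produces mojibake, which is why picking the correct source encoding matters more than the target one.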
05-24-2022
06:02 AM
1 Kudo
@FediMannoubi Below is a basic approach to solve this. Assuming both Postgres tables are populated with rows per your example, your NiFi flow first needs to get the CSV (there are various ways to do that; I use the GenerateFlowFile processor for testing). Once the contents of the CSV are in a flowfile, you can use a RecordReader-based processor to read it. This lets you write SQL against the flowfile with QueryRecord to get a single value, for example: SELECT city_name FROM FLOWFILE

Next, the flow needs to get the city_name value into an attribute; I use EvaluateJsonPath. After that comes an ExecuteSQL processor with an associated DBCP Connection Pool to Postgres, where your query is: SELECT city_id FROM CITY WHERE city_name=${city_name}

At the end of this flow you will have the city_name from the CSV and the city_id from Postgres. You can now combine them or use them further downstream to suit your needs. INSERT is done similarly: once you have the data in flowfiles or attributes, you write an insert with the same ExecuteSQL.

My test flow looks like this, but forgive the end, as I did not actually have a Postgres database set up. You can find this sample flow [here]. I hope this gets you pointed in the right direction for reading CSV and querying data from a database.
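The data path above (CSV value → attribute → parameterized lookup) can be sketched end to end in plain Python, with SQLite standing in for the Postgres CITY table; table contents here are made up for illustration:

```python
import csv
import io
import sqlite3

# Stand-in for the Postgres CITY table from the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE city (city_id INTEGER, city_name TEXT)")
conn.execute("INSERT INTO city VALUES (7, 'Lisbon')")

# 1. Read the CSV flowfile content and pull out city_name
#    (the QueryRecord / EvaluateJsonPath part of the flow).
flowfile = "city_name\nLisbon\n"
city_name = next(csv.DictReader(io.StringIO(flowfile)))["city_name"]

# 2. The ExecuteSQL step, with the attribute bound as a parameter
#    (safer than inlining ${city_name} directly into the SQL text).
city_id = conn.execute(
    "SELECT city_id FROM city WHERE city_name = ?", (city_name,)
).fetchone()[0]
print(city_name, city_id)
```

Note that in real NiFi, ${city_name} is substituted as text into the query, so quoting it correctly (or using parameters where supported) matters for string columns.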