Member since: 07-19-2018
Posts: 613
Kudos Received: 101
Solutions: 117
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 5091 | 01-11-2021 05:54 AM |
| | 3421 | 01-11-2021 05:52 AM |
| | 8788 | 01-08-2021 05:23 AM |
| | 8381 | 01-04-2021 04:08 AM |
| | 36678 | 12-18-2020 05:42 AM |
12-29-2019
06:56 AM
1 Kudo
Regarding: "file_attachment" : "[\"attachment1.docx\",\"attachment2.docx\",\"attachment3.docx\",\"attachment4.docx\"]" The processor translates the object as a string. You will need to take some action on it to prepare it as you want it to be downstream. Assuming you are working with content of FlowFile, the processor you need is ReplaceText and the syntax is: ${'$1':unescapeJson()} To dial it even further so its not a string at all: ${'$1':unescapeJson():replace('"[','['):replace(']"',']')} Here is same in the ReplaceText processor properties: If you are operating on an attribute, you can do the similar action with UpdateAttribute: ${attributeName:unescapeJson()} If this reply answers your question please mark it as a Solution.
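To make the transformation concrete, here is a hypothetical before/after of the value from the question (shortened to two entries for readability) once unescapeJson and the two replace calls have run:

Before: "file_attachment" : "[\"attachment1.docx\",\"attachment2.docx\"]"
After: "file_attachment" : ["attachment1.docx","attachment2.docx"]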
12-29-2019
06:42 AM
1 Kudo
@CJoe36 There are a couple of things you need to do here:

1. Get the value of the column into an attribute.
2. Create routes based on the value of that attribute.

For #1 there are a few ways. First, you can create a flow that uses a CSVReader and a known schema; using this you can translate and parse the columns. This requires multiple processors and a CSVReader controller service. Second, you can just use ExtractText with regex (one processor). Here is an example of ExtractText getting a quantity and SKU from an inventory CSV. Notice the regex used around the commas, and the () indicating which capture group maps to each defined attribute (sku or qty):

qty: .*?,,.*?,,(\d+)$
sku: (.*?),,.*?,,.*?

If your CSV has 10 columns and #5 is your value, you probably want something like this:

,,,,(.*?),,,,,

For #2 you want to use the RouteOnAttribute processor with routes defined using NiFi Expression Language. You define a route; once defined, you can choose it for the connections downstream of RouteOnAttribute. Anything else will go to unmatched. (The original post included screenshots of an example route and the routed connections.) If this reply helps answer your question, please mark it as a Solution.
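To make the ExtractText example concrete, here is a sketch with a made-up input line and route (the attribute names sku and qty come from the regexes above; the route name and threshold are purely illustrative):

Input line: SKU123,,unused,,7
Resulting attributes: sku = SKU123, qty = 7
RouteOnAttribute route property: lowStock = ${qty:lt(10)}

Lines with a qty below 10 would follow the lowStock relationship; everything else goes to unmatched.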
12-29-2019
06:26 AM
@KUnew It sounds like you have NiFi and Elasticsearch installed. For a proof of concept, the single-node simple installs should work fine. There are no further requirements; a Cloudera or Hortonworks cluster is not required. However, if you intend to have multiple instances of NiFi and one or more Elasticsearch master and data nodes, I would highly recommend using Hortonworks HDF with Ambari to manage the NiFi and Elasticsearch installation and configuration. I have created the ELK Management Pack for HDF 3.x, which allows installation of Elasticsearch, Logstash, Kibana, Filebeat, and Metricbeat in version 6.3.2 or 7.4.2. You can find it on my GitHub: https://github.com/steven-dfheinz/dfhz_elk_mpack I also have some articles here specifically about how to install the ELK Management Pack; just search for Elasticsearch in the community Articles section. It is very easy to install Ambari and the management pack, and then install NiFi and ELK. If this reply answers your question, please mark it as a Solution.
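As a rough sketch, installing a management pack into Ambari looks like the following (the archive path here is hypothetical; check the repository README for the actual file name and options):

ambari-server install-mpack --mpack=/path/to/dfhz_elk_mpack.tar.gz --verbose
ambari-server restart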
12-27-2019
07:57 AM
I have a use case scenario coming up next week where I will need to process some Parquet to CSV. I created a few demos with NiFi 1.10, but unfortunately it is not possible to use this version of NiFi in the customer's environment. I know I can still satisfy the actions I need to take with 1.9 without the new 1.10 Parquet features, but I know these new features would be more efficient.
Is it possible to drop the required artifacts into 1.9 and achieve Parquet Reader functionality?
Labels:
- Apache NiFi
12-27-2019
06:37 AM
@vikram_shinde The Escape Character is a setting in the CSVReader controller service (CSVReader 1.9.0.3.4.1.1-4; my NiFi version is 1.9). An escape character is a CSV basic, as it is required to ignore quotes inside a field that is itself enclosed by quotes. So if you feed the escaped string to the CSVReader, it will output the correct values without the escape characters.
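For reference, a sketch of the relevant CSVReader controller service properties (the values shown are the common defaults, not taken from this thread):

Quote Character: "
Escape Character: \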
12-27-2019
06:30 AM
@ankita_pise Did you try appending the JSON string of values to the URL, like you did in curl? I believe this may work in NiFi if it works in curl. Let us know the results.
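A hypothetical sketch of the curl side, letting curl handle the URL encoding (host, path, and parameter names are made up):

curl --get 'http://myhost:8080/api/search' --data-urlencode 'filter={"key":"value"}'

If that works, the same encoded URL should then be usable from NiFi.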
12-27-2019
06:19 AM
Hello @saivenkatg55, can you please provide more details about your NiFi environment? Without this kind of information we cannot be very helpful:

- How many nodes?
- How much RAM?
- How many cores?
- What are the min/max values for memory in the NiFi configuration?
- How many processors are running?

That said, restarting NiFi often on a heavily used system is a good practice. It is, however, an indication that the system needs additional attention for performance tuning. Items that could need attention are memory settings, garbage collection settings, increasing resources (nodes/RAM/cores), and optimization of the flow itself.
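For reference, the min/max memory values live in NiFi's conf/bootstrap.conf; a sketch with example sizes (tune these to your hardware, they are not a recommendation):

java.arg.2=-Xms4g
java.arg.3=-Xmx8g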
12-24-2019
05:00 AM
@vikram_shinde Can't you just use an escape character in the CSV?

"ICUA","01/22/2019","08:48:18",394846,"HAVE YOU REMOVED THE KEY?","YES---select \"Accept Response\" and continue with the remove","","","1"

If you can't change the source, a simple ReplaceText should resolve changing "quoted text" to \"quoted text\". Of course you would need to handle this downstream, but, for example, the CSVReader, if configured properly, will ignore the escape characters when mapping that CSV column to the schema. If this answer helps with your issue, please mark it as an Accepted Solution.
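One possible ReplaceText configuration for this is sketched below. It is untested, and the regex only escapes quotes that do not sit next to a comma or a line boundary, which fits the sample line above but may not cover every CSV edge case:

Replacement Strategy: Regex Replace
Search Value: (?<=[^,\r\n])"(?=[^,\r\n])
Replacement Value: \\"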
12-24-2019
04:43 AM
1 Kudo
This is a very basic use case scenario for NiFi. I would recommend that once you get the file into NiFi, you split it line by line. Once you have the log file splits, you do the match logic on each single line, route the lines you want downstream, and handle them accordingly. There are many ways to do this, and the fun part of NiFi is discovering what works best for you. Here is a NiFi template I have that checks log files: https://github.com/steven-dfheinz/NiFi-Templates/blob/master/Get_File_Demo.xml If this answer helps solve your issue, please mark it as an Accepted Solution.
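A minimal sketch of such a flow, assuming line-oriented logs (RouteOnContent is just one way to do the matching):

GetFile -> SplitText (Line Split Count = 1) -> RouteOnContent -> per-route downstream handling
(lines that match no route go to unmatched)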
12-24-2019
04:31 AM
A couple of things I notice. First, per the documentation, your hostname should be an FQDN and mapped to the correct IP. Not mapping a proper hostname to the main server IP will cause confusion between the short hostname, long hostname, localhost, localhost IP, etc. Second, after fixing that, you need to create permissions from your NiFi FQDN hostname to your MySQL server with a user that has the proper privileges. A simple test of network connectivity from NiFi to the MySQL host is to telnet to port 3306. If the telnet connects, then you know for sure you need to create the user@'host' permission grants as follows:

CREATE USER 'user'@'%' IDENTIFIED BY 'password';
GRANT ALL PRIVILEGES ON *.* TO 'user'@'%' WITH GRANT OPTION;
FLUSH PRIVILEGES;

Depending on your setup, the % (wildcard) may or may not work. If required, replace the wildcard with the NiFi FQDN hostname. Last suggestion: tail -f /var/log/nifi/nifi-app.log while testing. You will see much more info about errors there than in the NiFi UI. You can also set each processor's bulletin level to get more information in the UI. If this answer helps solve your issue, please accept it as the solution.
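The connectivity test mentioned above might look like this (the hostname is a placeholder for your MySQL server's FQDN):

telnet mysql-host.example.com 3306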