Member since
01-27-2023
229
Posts
74
Kudos Received
45
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1758 | 02-23-2024 01:14 AM |
| | 2293 | 01-26-2024 01:31 AM |
| | 1431 | 11-22-2023 12:28 AM |
| | 3582 | 11-22-2023 12:10 AM |
| | 3660 | 11-06-2023 12:44 AM |
04-11-2023
08:55 AM
To be honest, I do not know why you cannot remove those double quotes. Maybe you can try escaping the quotes and wrapping the entire command in single quotes, something like this (I might have added a few extra single quotes by mistake, but you get the point):

'-X POST -H '\''Content-Type:application/json'\'' -d '\''{"name": "Apple AirPods", "data": {"color": "white", "generation": "3rd", "price": 135}}'\'' https://api.restful-api.dev/objects'

In theory, each '\'' sequence closes the outer single quote, emits a literal single quote, and reopens the outer quote, so the double quotes inside the JSON survive intact. Wrapping the whole command in single quotes allows it to be passed to the ExecuteStreamCommand processor as a single argument, which ensures that the quotes are preserved. PS: I did not test this and I am not 100% sure it will work, but it is worth a shot until somebody with more experience can guide you further.
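A minimal way to sanity-check the quoting on any Linux box, independent of NiFi itself (the variable name is only for illustration):

```bash
# Each '\'' sequence closes the outer single quote, emits a literal single
# quote, and reopens the outer quote, so the inner quoting survives.
ARGS='-X POST -H '\''Content-Type:application/json'\'' -d '\''{"name": "Apple AirPods", "data": {"color": "white", "generation": "3rd", "price": 135}}'\'' https://api.restful-api.dev/objects'

# The double quotes inside the JSON should print intact:
echo "$ARGS"
```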
04-11-2023
08:20 AM
Hi @Jaimin7, I am not quite sure how your SMTP server is configured, but everywhere I have implemented the PutEmail processor, I needed 5 mandatory properties:
1) SMTP Hostname
2) SMTP Port
3) SMTP Username (even though this is not a mandatory field in NiFi, the SMTP server required it to allow the connection)
4) SMTP Password (same as above: not mandatory in NiFi, but required by the SMTP server)
5) From
With all these fields configured, I was able to send emails from NiFi without any restrictions; a sketch of such a configuration follows below. Of course, I made sure that the firewall between NiFi and the SMTP server allows such connections 🙂
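For reference, a minimal PutEmail property sketch; every value here is a placeholder, not taken from this thread:

```
SMTP Hostname = smtp.example.com
SMTP Port     = 587
SMTP Username = nifi@example.com
SMTP Password = ********
From          = nifi@example.com
```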
04-11-2023
03:03 AM
Hi there, I need your experience with a flow I have been struggling with. My flow is as follows: ConsumeKafka --> ConvertRecord --> ConvertJSONToSQL --> PutSQL.

I am extracting data out of some Kafka brokers, which comes in the following format: {"Column1": "1e39c17d25cb420e8e720aa4b1ae005e","TimeStamp": "2023-04-10T10:43:15.794241429+03:00","Column3": "some_string","Column4": false,"Column5": "some_string","Column6": "some_string","Column7": false,"Column8": "some_string","Column9": "","Column10": 0,"Column11": 0,"Column12": "","Column13": 0,"Column14": 137,"Column15": "","Column16": 0,"Column17": 0,"Column18": "","Column19": 138,"Column20": 1550,"Column21": "some_string","Column22": ""}

Afterwards, I am using a ConvertRecord with a RecordReader = JsonTreeReader (default configuration: Infer Schema and Root Node) and a RecordWriter = JsonRecordSetWriter (default configuration: Inherit Record Schema and Pretty Print JSON = false). The data comes out in JSON format, as expected, without any modifications.

Next, I go into a ConvertJSONToSQL. Here, I have defined a JDBC Connection Pool, set Statement Type = INSERT and Table Name = my_table. All the other configurations remained at their defaults. The data comes out as attributes and looks fine: sql.args.2.type --> 93, sql.args.2.value --> 2023-04-10T10:43:15.794241429+03:00.

Since I have to insert the data into a PostgreSQL database, I am using PutSQL to save it. As the target column is of type "timestamp without time zone", I added an UpdateAttribute processor to modify the date. It defines a property named "sql.args.2.value" with the value "${sql.args.2.value:substringBefore('T')} ${sql.args.2.value:substringAfter('T'):substringBefore('+')}". The value which now comes out is "2023-04-10 10:43:15.794241429". However, when inserting the data into PostgreSQL, I get a different value from the one present in the attribute: "2023-04-19 15:20:36.429".

To debug the flow further, I added another UpdateAttribute with a property named "a3" set to ${sql.args.2.value:toDate("yyyy-MM-dd HH:mm:ss.SSS"):toNumber():format("yyyy-MM-dd HH:mm:ss.SSS")}, and a property named "a4" set to "${sql.args.2.value:toDate("yyyy-MM-dd HH:mm:ss.SSS"):toNumber()}". The results are pretty strange: a3 --> 2023-04-19 15:20:36.429 and a4 --> 1681906836429. Basically, the problem appears when translating that date into the datetime type. Does anybody know why it behaves like this? I am certain that this is not a bug, but something I am not using correctly 🙂 Thank you 🙂
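A note on the likely cause, based on the numbers above and assuming NiFi's toDate delegates to Java's lenient SimpleDateFormat: the "SSS" pattern reads the whole fractional part "794241429" as a count of milliseconds rather than nanoseconds, and the excess rolls the timestamp forward by roughly nine days. A quick shell check of the arithmetic:

```bash
# 794,241,429 interpreted as milliseconds is about 9.19 days:
echo "794241429 / 1000 / 86400" | bc -l        # ≈ 9.19

# a4 = 1681906836429 ms; converting the epoch seconds back to a date lands
# exactly on the shifted value (12:20:36 UTC, i.e. 15:20:36 at UTC+03:00):
date -u -d @1681906836 '+%Y-%m-%d %H:%M:%S'    # 2023-04-19 12:20:36
```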
Labels:
- Apache NiFi
04-11-2023
01:17 AM
@Jame1979, It would really help if you could post a screenshot of each of the processors, as the problem might be related to some of your configurations. You might be doing something in UpdateAttribute which affects the behavior of FetchS3Object. As far as I can see, you have a terminated thread and a running thread, meaning that something happened there. I had a similar issue and solved it by restarting the NiFi cluster; there was a problem in the back end which translated into threads being generated but not actually performing any action. After the restart, everything went back to normal and I started collecting data using FetchS3Object :). You could also set FetchS3Object's log level to DEBUG to see if any logs are being generated, which might point you in the right direction.
04-10-2023
01:22 PM
Well, besides the fact that you are calling a different API endpoint than you did previously, yes, it could be that the payload is messing up your command. What I would try is to use an UpdateAttribute to generate the curl command as an attribute and see how it gets constructed; an illustrative sketch follows below. I would then take that command and execute it on my server to check that it was constructed correctly and actually works. If not, you will see where the error is located and can try to solve it.
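For illustration, the UpdateAttribute property could look like the following; the property name, payload attribute, and URL are hypothetical, not from this thread:

```
curl.command = curl -X POST -H 'Content-Type: application/json' -d '${payload}' https://api.example.com/objects
```

Listing the queue after the processor and viewing the FlowFile's attributes then shows exactly which string the downstream processor would receive.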
04-10-2023
11:38 AM
Well, to be honest, it is pretty hard to debug something you have no clue about 😄 From your command I see that you are using the value for "referer" as a variable taken out of an attribute; I assume that the value is correct. Now, based on my experience with ExecuteStreamCommand, I would separate the parameters with semicolons (see the configuration sketch below)... something like this (it is worth a shot; otherwise you will need to test each parameter until you find the faulty one): -X;POST;-H;referer:${Referer};-H;'Content-Type: application/json';-d;'{\"newTopics\": [{\"name\":\"testing123\",\"numPartitions\":3,\"replicationFactor\":3}], \"allTopicNames\":[\"testing123\"]}';--negotiate;-u;:;-b;/tmp/cookiejar.txt;-c;/tmp/cookiejar.txt
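For context, this is how the processor could be configured. The semicolon works because ExecuteStreamCommand's Argument Delimiter property defaults to ";", so each token becomes a separate argument and no shell quoting is involved:

```
Command Path       = curl
Command Arguments  = <the semicolon-separated list above>
Argument Delimiter = ;
```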
04-10-2023
10:08 AM
1 Kudo
Well, I am not quite certain what SMM is and what sort of API calls it accepts, but as far as the received response goes, it could have an endless list of root causes. Error 415 (Unsupported Media Type) indicates that your server refused to accept the request due to the payload format. The problem might be caused by the Content-Type header, the content encoding, or the data itself. You could first try from a Linux machine and see whether that curl command truly works; a generic smoke test is sketched below. Another option would be to use Postman to test the API command and check the results. Last but not least, maybe try using InvokeHTTP instead of ExecuteStreamCommand, as InvokeHTTP is more tailored for such actions. InvokeHTTP: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.20.0/org.apache.nifi.processors.standard.InvokeHTTP/index.html
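A generic smoke test from a Linux shell; the host, port, path, and payload below are placeholders, not SMM's actual API:

```bash
# -v prints the request and response headers, so you can confirm which
# Content-Type is actually being sent and what the server replies.
curl -v -X POST \
  -H 'Content-Type: application/json' \
  -d '{"key": "value"}' \
  'https://your-smm-host:8585/your/api/endpoint'
```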
04-05-2023
12:15 AM
1 Kudo
Hi @saquibsk, Ok, understood. What I can tell you is that what you are trying to achieve is not impossible, but it is not easy either. I believe in the power of a community, but at the same time I believe that the purpose of the community is to help you with solutions and advice for your problems, not to do the work for you 🙂 I assume that you have already started a flow, so let's start from there: what you developed, why it is not good, and what you are missing from it. From my point of view, there are two options here:
1) You modify all of your processors to write the Bulletin Level at INFO (or DEBUG) and afterwards, using an InvokeHTTP, you access your Bulletin Board with the REST API and extract your information (a rough sketch of that REST call follows below). This is not highly recommended, as you will generate very large log files. Besides that, your certificates must be generated accordingly, otherwise you will get some errors.
2) At each step in your flow, you write a message with LogMessage, which will save your data into nifi-app.log. In LogMessage you can define exactly what you want to write. Afterwards, you can create a separate flow using a TailFile processor and extract everything you need from your nifi-app.log file 🙂
Once you have extracted your data, either from your Bulletin Board or from your log file, you can build the SQL statement for inserting the data into the DB.
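A rough sketch of the REST call from option 1, assuming a secured cluster with token authentication; the host, port, and token are placeholders:

```bash
# GET the latest bulletins; the bulletin-board endpoint supports filters
# such as limit, sourceName, and message.
curl -s -k \
  -H "Authorization: Bearer $NIFI_TOKEN" \
  "https://nifi-host:8443/nifi-api/flow/bulletin-board?limit=100"
```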
04-04-2023
06:43 AM
1 Kudo
Hi @noncitizen, I have tested the following configuration on my local machine and it seems to work fine. You can give it a try and let me know if it works: Using GenerateFlowFile, I created a FlowFile having the attribute "absolute.path" defined as "/home/test/testFolder/". You already have a flow which generates all this information, so you can skip this step. In ExecuteStreamCommand, I have defined the following:
Command Path = bash
Command Arguments = -c;mv ${absolute.path:append('*.log')} ${absolute.path:append('logfiles')}
The result is that all the files with the extension .log from within my folder /home/test/testFolder/ have been moved into /home/test/testFolder/logfiles; what this effectively executes is shown below. One thing to mention:
- the absolute path should end with "/"; otherwise, with your expression, you will end up with something like /path/input*.log.
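In other words, for the example attribute value the processor effectively runs the following (a plain-shell equivalent, just for illustration):

```bash
# bash -c receives the fully expanded string as a single command, and the
# glob *.log is expanded by bash itself:
bash -c 'mv /home/test/testFolder/*.log /home/test/testFolder/logfiles'
```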
04-04-2023
06:17 AM
Hi @saquibsk, What exactly are you trying to achieve with this post? Do you want to learn how to build something in NiFi, or do you want somebody to help you write an SQL query for your data warehouse? Please be so kind as to provide some more details regarding your problem.