Member since
01-27-2023
229
Posts
73
Kudos Received
45
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 628 | 02-23-2024 01:14 AM
 | 796 | 01-26-2024 01:31 AM
 | 563 | 11-22-2023 12:28 AM
 | 1258 | 11-22-2023 12:10 AM
 | 1459 | 11-06-2023 12:44 AM

10-03-2023
06:45 AM
1 Kudo
@hkh, your post lacks the minimum information required for anybody to help you out. Nevertheless, based on your error message, it seems that you are trying to connect to a PostgreSQL database. As you mentioned, you did an upgrade at the database level, meaning that some configuration aspects have changed. Your error message states clearly what the issue is and where you need to look to solve it. Unfortunately, this has nothing to do with NiFi, but with how your DBA team has configured your PostgreSQL instance. Have a look here:
https://stackoverflow.com/questions/67588076/why-do-i-get-error-type-10-authentication-not-supported-for-postgresql-13-even
https://stackoverflow.com/questions/64210167/unable-to-connect-to-postgres-db-due-to-the-authentication-type-10-is-not-suppor
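In case it helps to confirm the diagnosis: "authentication type 10" is PostgreSQL's SCRAM-SHA-256 authentication, which older JDBC drivers do not understand. A minimal sketch, assuming psycopg2 is available and with placeholder connection details, to check what the upgraded server now expects:

# Minimal sketch (psycopg2 and placeholder credentials assumed).
# "Authentication type 10" is SCRAM-SHA-256; check which password encryption
# the server uses and compare it against your JDBC driver version.
import psycopg2

conn = psycopg2.connect(host="your-postgres-host", port=5432, dbname="postgres",
                        user="your_user", password="your_password")
with conn.cursor() as cur:
    cur.execute("SHOW password_encryption;")
    print(cur.fetchone()[0])  # 'scram-sha-256' means an old driver jar will fail
conn.close()

If it prints scram-sha-256, the usual fix is to point your DBCPConnectionPool at a newer postgresql JDBC jar (42.2.x or later) rather than downgrading the server to md5.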
10-02-2023
01:28 AM
1 Kudo
@EddyChan, if I understood correctly, you want to set an already created Parameter Context on a Process Group, right? If so, you can do that through the REST API, using PUT as the request method and a payload like this:
{
"revision": {
"clientId": "6df050ee-0002-1de6-27e5-05edebf761b0",
"version": 0
},
"disconnectedNodeAcknowledged": false,
"component": {
"id": "6dbf7e4e-849e-39ad-a2dd-dbe3f890a1d0",
"name": "test",
"comments": "",
"parameterContext": {
"id": "b2ce1579-0187-1000-ffff-ffffb8598d34"
},
"flowfileConcurrency": "UNBOUNDED",
"flowfileOutboundPolicy": "STREAM_WHEN_AVAILABLE",
"defaultFlowFileExpiration": "0 sec",
"defaultBackPressureObjectThreshold": "10000",
"defaultBackPressureDataSizeThreshold": "1 GB"
}
}
All you need to know is the ID of the Parameter Context you would like to set and, of course, the ID of the Process Group you are trying to modify, and you are all done.
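For illustration, here is how that PUT could be issued from a script. A minimal sketch, assuming an unsecured NiFi on localhost:8080 and reusing the IDs from the payload above:

# Sketch only: send the payload above to PUT /nifi-api/process-groups/{id}.
# The revision version must match the process group's current revision, which
# you can fetch first with GET /nifi-api/process-groups/{id}.
import requests

pg_id = "6dbf7e4e-849e-39ad-a2dd-dbe3f890a1d0"
payload = {
    "revision": {"clientId": "6df050ee-0002-1de6-27e5-05edebf761b0", "version": 0},
    "disconnectedNodeAcknowledged": False,
    "component": {
        "id": pg_id,
        "name": "test",
        "parameterContext": {"id": "b2ce1579-0187-1000-ffff-ffffb8598d34"},
    },
}

resp = requests.put(f"http://localhost:8080/nifi-api/process-groups/{pg_id}", json=payload)
resp.raise_for_status()
print(resp.json()["component"]["parameterContext"]["id"])

If your NiFi is secured, you would first obtain a token via POST /nifi-api/access/token and pass it as a Bearer Authorization header.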
09-26-2023
06:04 AM
@lafi_oussama, first of all, what is the compression format of the files you are trying to unzip? What error did you receive when using UnpackContent? Have you tried using CompressContent ( https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.12.1/org.apache.nifi.processors.standard.CompressContent/ )? You can configure the processor to either compress or decompress your files, and there are multiple compression formats to choose from.
09-25-2023
11:17 PM
@LKBand, if you try the select below, what error do you encounter? Unfortunately I have no MS SQL database to test with, so I can only help you debug your situation one step at a time.
select
"06 QQQ" as 06_QQQ,
"11 JJJ" as 11_JJJ,
"12 KKK" as 12_KKK,
"13 JJJ" as 13_JJJ
from DB_TEST.dbo.FGRID
-- or with ` instead of ", or with T-SQL's native square brackets, e.g. [06 QQQ]
-- or with QQQ_06, moving the number to the back of the alias instead of the front.
I assume that you have your Record Writer configured as Inherit Record Schema.
09-25-2023
10:33 AM
2 Kudos
@charliekat, this discussion is far from easy and will most likely remain a rather general one, with no specific answer to your question 🙂 There are plenty of things you could monitor and automate, starting with CPU usage, RAM usage, CPU and RAM temperature, I/O on your SSD, network packets and so on. Directly from NiFi you won't be able to perform many of these tasks, as it was not created for such things. However, if you are good enough with PowerShell (gamer = Windows in 95% of cases), you can create some scripts and execute them directly from NiFi. Then, depending on your script's output, you can call other scripts which perform the desired action: killing long-running processes, or processes consuming too much CPU while you are gaming, and so on. These actions only represent around 5-10% of what you can do. The other 90-95% are actually related to your hardware and environment: GPU, OS, SSD, mouse (DPI), screen (resolution + frame rate), keyboard, Ethernet connection, CPU and the overall cooling system, Windows updates, game settings, GPU driver updates and so on ... and unfortunately, for those, NiFi won't be much of a friend 😞 Nevertheless, don't take my answer as discouragement; take it as a challenge to prove me wrong, go play with NiFi and make something amazing 🙂
09-25-2023
10:11 AM
1 Kudo
I managed to solve my issue using UpdateRecord with the Literal Value replacement strategy. I defined the following EL:
${field.value:toDate('dd-MMM-yy'):format('yyyy-MM-dd')}
The Avro schema, on the other hand, remained the same: "type": "int" with "logicalType": "date". As for replacing the month letters, I hard-coded a very ugly IF-ELSE statement:
${field.value:contains('IAN'):ifElse(${field.value:replace('IAN','JAN')},${field.value:contains('IUN'):ifElse(${field.value:replace('IUN','JUN')},${field.value:contains('IUL'):ifElse(${field.value:replace('IUL','JUL')},${field.value})})}):toDate('dd-MMM-yy'):format('yyyy-MM-dd')}
PS: this link helped me a lot: https://docs.oracle.com/javase/8/docs/api/java/text/SimpleDateFormat.html
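For readers who find the EL hard to follow, a sketch of the same logic in plain Python (IAN, IUN and IUL are the Romanian month abbreviations that differ from English; all others match):

# Map the Romanian abbreviations to English, then parse and re-emit as an ISO
# date, mirroring the toDate(...):format(...) chain above. Note that %b is
# locale-dependent and assumes an English locale here.
from datetime import datetime

RO_TO_EN = {"IAN": "JAN", "IUN": "JUN", "IUL": "JUL"}

def normalize(value: str) -> str:
    for ro, en in RO_TO_EN.items():
        value = value.replace(ro, en)
    return datetime.strptime(value, "%d-%b-%y").strftime("%Y-%m-%d")

print(normalize("31-IAN-22"))  # 2022-01-31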
09-25-2023
04:56 AM
1 Kudo
@BerniHacker, I did not even take the state into consideration 🙂 I figured you were trying to execute it for the first time and assumed from the start that you had nothing saved in your state. Congrats on solving your issue.
09-25-2023
02:45 AM
Hi guys, I have been struggling with a data conversion and I can't really figure out how to get it done. I have a CSV file which comes into my flow as follows:
COLUMN1 COLUMN2 COLUMN3 COLUMN4 COLUMN5 COLUMN6 MONTH
0182 Tel set W27 0 2200 31-IAN-22
0183 Apa cai W27 0 2200 30-SEP-22
0185 HDM set MT8 1 2200 28-FEB-22
0185 Apa alo MT8 0 2200 31-OCT-22
0186 HDM set HG5 1 2200 30-IUN-22
0188 Fus alo HG5 1 2200 30-APR-22
Afterwards I am using a ConvertRecord processor to transform the CSV into an Avro file, using the following schema:
{
"type": "record",
"name": "nifiRecord",
"namespace": "org.apache.nifi",
"fields": [
{
"name": "COLUMN1",
"type": [
"string",
"null"
]
},
{
"name": "COLUMN2",
"type": [
"string",
"null"
]
},
{
"name": "COLUMN3",
"type": [
"string",
"null"
]
},
{
"name": "COLUMN4",
"type": [
"string",
"null"
]
},
{
"name": "COLUMN5",
"type": [
"string",
"null"
]
},
{
"name": "COLUMN6",
"type": [
"int",
"null"
]
},
{
"name": "MONTH",
"type": [
"string",
"null"
]
}
]
}
Now, in the next step, I would like to transform MONTH from STRING into DATE, so I can insert the value into a BigQuery table (the target column is DATE). For that, I am using an UpdateRecord processor in which I tried several NiFi Expression Language variants, but none of them works for me. Basically, the schema will have to change into:
{
"name": "MONTH",
"type": [
"null",
{
"type": "int",
"logicalType": "date"
}
]
}
Unfortunately, when trying to convert that string date into a proper date, I keep encountering strange errors. What I want is 31-IAN-22 to end up as 31-JAN-22 (or 31-01-22) in the generated Avro file, as an INT date. As you can see, the month itself is not necessarily written in English. I have tried several ELs:
${field.value:replace('SEP', '09'):toDate('dd-MM-yy'):format('dd-MMM-yy'):toDate('dd-MMM-yy'):toNumber()}
${field.value:toDate('dd-MM-yy'):format('dd-MMM-yy'):toDate('dd-MMM-yy'):toNumber()}
${field.value:toDate('dd-MM-yy'):format('dd-MMM-yy'):toNumber()}
${field.value:toDate('dd-MM-yy'):toNumber()}
Every time I receive some strange error, for example:
org.apache.nifi.attribute.expression.language.exception.IllegalAttributeException: Cannot parse attribute value as a date; date format: yyyy-mm-dd; attribute value: 30-SEP-23
java.lang.NullPointerException: null
org.apache.nifi.serialization.record.util.IllegalTypeConversionException: Failed Conversion of Field [MONTH] from String [1696021200000] to LocalDate
Does anybody know how I could achieve this? Thanks 🙂
Labels:
- Apache NiFi
09-25-2023
02:08 AM
1 Kudo
@BerniHacker, I have no experience with AWS, however I am using GCP ... so it should mostly be the same thing. In terms of GenerateTableFetch, you do not need to use Column for Value Partitioning at the same time as Maximum-value Columns ... I suggest you only use Maximum-value Columns, as you will get the same result but a little faster. Now, regarding your problem: I encountered something similar to what you are describing, and it was related to connectivity and quotas set on the cloud environment. I set GenerateTableFetch to DEBUG logging and ran it once to see what gets written to the Bulletin Board. At the same time I opened the NiFi logs and tailed nifi-app.log to catch anything out of the ordinary. Once the quotas had been increased on the GCP side, I was able to extract around 1,000,000,000 rows without encountering any further issues. I am also extracting data from local Oracle and MySQL instances, totaling more than 5B rows between 09:00 and 10:00 AM, using a combination of GenerateTableFetch and ExecuteSQLRecord, and I have never encountered a problem with GenerateTableFetch due to the size of the table. What you could also try is to execute the SELECT statement in your IDE and see how long it takes to return the results. Note that NiFi might be a little slower, depending on where it is deployed, as it can lose time going through proxies and so on.
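If you would rather time the query from a script than an IDE, a minimal sketch, with the driver and connection details as placeholders (any DB-API driver such as mysql.connector, cx_Oracle or psycopg2 follows the same pattern):

# Time the raw SELECT outside NiFi to separate database latency from flow overhead.
import time
import psycopg2  # stand-in; swap for your database's DB-API driver

conn = psycopg2.connect(host="your-db-host", dbname="your_db",
                        user="your_user", password="your_password")
with conn.cursor() as cur:
    start = time.perf_counter()
    cur.execute("SELECT * FROM your_table")  # the statement GenerateTableFetch produced
    rows = cur.fetchall()
    elapsed = time.perf_counter() - start
print(f"{len(rows)} rows in {elapsed:.2f}s")
conn.close()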
09-25-2023
01:56 AM
@Lalit1219, you could try using SplitContent to split your CSV into several smaller CSVs. Within the processor you can set the Byte Sequence Format to Text, and in the Byte Sequence you add the exact number of "_" characters you have in your CSV. This should split your CSV into smaller chunks, after which you can use a RouteOnContent processor to check for "Top 10 Power Consumers" and send only that file into further processing. Next you can use a ConvertRecord processor where you define a CSV Reader and a CSV Writer and set ";" as the delimiter.
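The same split-and-route idea expressed in plain Python, for illustration only (the separator length and the file name are placeholders; the section title comes from the description above):

# Sketch of the SplitContent + RouteOnContent idea: split the file on the
# underscore separator, then keep only the "Top 10 Power Consumers" section.
SEPARATOR = "_" * 40  # use the exact number of underscores your CSV contains

with open("report.csv", encoding="utf-8") as f:
    sections = f.read().split(SEPARATOR)

wanted = [s for s in sections if "Top 10 Power Consumers" in s]
for section in wanted:
    # downstream, the CSV Reader would parse these lines with ';' as delimiter
    for line in section.strip().splitlines():
        print(line.split(";"))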