Member since: 09-04-2019
Posts: 62
Kudos Received: 17
Solutions: 11
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 776 | 10-30-2023 06:50 AM
 | 10434 | 02-27-2023 09:25 AM
 | 1858 | 07-07-2022 09:17 AM
 | 1762 | 01-26-2022 06:25 AM
 | 2545 | 01-25-2022 06:19 AM
07-07-2022
06:51 AM
I can only assume that at one point you upgraded to 1.16? If so, run the commands below from within your database_repository directory:
mv nifi-identity-providers.trace.db.migration_backup nifi-identity-providers.trace.db
mv nifi-identity-providers.mv.db.migration_backup nifi-identity-providers.mv.db
mv nifi-flow-audit.trace.db.migration_backup nifi-flow-audit.trace.db
mv nifi-flow-audit.mv.db.migration_backup nifi-flow-audit.mv.db
Then remove flow.json.gz in the same place you have your flow.xml.gz.
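If you want to confirm the backup files are actually there before renaming them, a quick check (assuming database_repository sits under your NiFi home directory) is:
ls -l database_repository/*.migration_backup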
02-05-2022
04:52 PM
Those parentheses in your search would be considered special characters in a regular expression unless escaped. I can get this to work using this: Active: active \(running\)
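To illustrate (assuming the processor evaluates the property as a regular expression), the unescaped and escaped forms match different text:
Active: active (running)     matches "Active: active running" (the parentheses only form a capture group)
Active: active \(running\)   matches the literal text "Active: active (running)"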
02-05-2022
04:16 PM
1 Kudo
You can set the Scheduling Strategy of the processor to CRON driven [1] and put your cron expression in the Run Schedule field. Note that this is not the typical OS cron syntax; it is based on the Quartz cron scheduler [2]. A good online tool for building cron expressions is [3].
[1] https://nifi.apache.org/docs/nifi-docs/html/user-guide.html#scheduling-strategy
[2] http://www.quartz-scheduler.org/
[3] https://www.freeformatter.com/cron-expression-generator-quartz.html
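As a generic example (not specific to your flow), a Quartz expression that fires every day at 08:30 would be:
0 30 8 * * ?
Note the leading seconds field and the ? placeholder, which are Quartz-specific and not part of OS cron syntax.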
01-26-2022
07:10 AM
Please also see this post: https://community.cloudera.com/t5/Support-Questions/Send-TCP-acknowledgement-in-NIFI/m-p/334438#M231765
01-26-2022
07:06 AM
Hi, could you elaborate more? The ACK is part of the TCP protocol itself.
01-26-2022
06:25 AM
Hello, most likely it is because on your CSV Reader you have Treat First Line as Header = false (the default). Change that to true.
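For illustration (sample data, not from your flow), with input like the one below, Treat First Line as Header = true makes the reader use the first line as column names instead of treating it as a data record:
id,name,value
1,alpha,10
2,beta,20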
01-25-2022
11:27 AM
1 Kudo
@zhangliang To accomplish that I would use UpdateRecord. Since your data is CSV and structured, we can use record manipulation to accomplish this. First I would treat all your values as strings and build an Avro schema to use: {
"type":"record",
"name":"nifiRecord",
"namespace":"org.apache.nifi",
"fields":[
{"name":"test_a","type":["null","string"]},
{"name":"test_b","type":["null","string"]},
{"name":"test_c","type":["null","string"]},
{"name":"test_d","type":["null","string"]},
{"name":"test_e","type":["null","string"]}
]
} Then I would configure my UpdateRecord to use a CSV Reader and a CSV Writer.
I would configure the CSV Reader like this:
Schema Access Strategy = Use 'Schema Text' Property
Schema Text = (put your Avro schema there)
Value Separator = |
On the CSV Writer, leave everything default except:
Value Separator = |
Finally, the UpdateRecord processor will need 2 user-defined properties. In this case we want to update the fields "test_c" and "test_d", and we can use record path manipulation, in particular the substringBefore function, to keep only everything before the dot ".". The two properties are sketched after the example below.
This will then take an input like this:
test_a|test_b|test_c|test_d|test_e
a|b|3.0|4.0|5.0
a|b|3.0|4.0|5.0
a|b|3.0|4.0|5.0
and produce an output like this:
test_a|test_b|test_c|test_d|test_e
a|b|3|4|5.0
a|b|3|4|5.0
a|b|3|4|5.0
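The two user-defined properties on UpdateRecord, as I would configure them (this is my reconstruction of the configuration referenced above, so treat the exact record paths as an assumption), would be:
/test_c = substringBefore( /test_c, '.' )
/test_d = substringBefore( /test_d, '.' )
with Replacement Value Strategy set to Record Path Value so the values are evaluated as record paths rather than literals.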
01-25-2022
06:19 AM
1 Kudo
I wonder what Java version you have on your Windows machine? NiFi supports Java 8 and 11.
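A quick way to check from a command prompt (assuming java is on your PATH):
java -version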
01-24-2022
11:57 AM
Seems like you should follow this: https://nifi.apache.org/docs/nifi-docs/html/administration-guide.html#proxy_configuration
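In case it helps, the proxy-related entries in nifi.properties end up looking something like this (the host and context path below are placeholders, not values from your environment):
nifi.web.proxy.host=proxy.example.com:443
nifi.web.proxy.context.path=/myproxypath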