Member since: 11-16-2015
Posts: 892
Kudos Received: 650
Solutions: 245
My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 5669 | 02-22-2024 12:38 PM |
|  | 1389 | 02-02-2023 07:07 AM |
|  | 3087 | 12-07-2021 09:19 AM |
|  | 4207 | 03-20-2020 12:34 PM |
|  | 14165 | 01-27-2020 07:57 AM |
10-02-2019
02:11 AM
The prefetch (fetch size) option of JDBC can improve your performance a lot. You can add this option as a dynamic property in the connection pool. My performance improved by a factor of 20 when I raised it to 2000. JDBC uses a default value of 10 when you don't specify one.
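The same knob exists in Python's DB-API as `cursor.arraysize`, which plays the role of the JDBC fetch size (rows pulled per round trip). A minimal sketch with sqlite3 (in-process, so the latency win is far smaller than over a network JDBC connection, but the mechanics are the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (x INTEGER)")
cur.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(5000)])

cur.arraysize = 2000           # analogous to statement.setFetchSize(2000) in JDBC
cur.execute("SELECT x FROM t")
batch = cur.fetchmany()        # fetchmany() pulls up to arraysize rows per call
print(len(batch))              # 2000
```

Over a real network connection, a larger fetch size means fewer round trips per result set, which is where the factor-20 speedup comes from.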
08-21-2019
08:11 PM
Hi @mburgess, I was looking for a processor that would let me pull a query out of a field of the incoming flow file, instead of turning the entire flow file into a query. PutDatabaseRecord allowed me to do that (which is when I discovered that the Hive connection does not support an explicit call to conn.commit()). I want to keep as much of the flow file intact as possible; is there a way to do that? Thank you.
08-01-2019
07:02 AM
The error message was not very clear, but it works with HDF 1.9.2. Thanks a lot!
05-30-2019
04:07 PM
@Matt Burgess, thanks, I will try on my side. But which processors should I use if the batch messages look like the sample below (each line is a run of comma-separated JSON objects, and lines are separated by a carriage return)?

{"id":"D1-V11-id11","provider":"D1-V12-provider","type":"D1-V13-type","parentsource":"D1-V14-parentsource","account":"D1-V15-account","region":"D1-V16-region",...},{"id":"D1-V21-21","provider":"D1-V22-provider","type":"D1-V23-type","parentsource":"D1-V24-parentsource","account":"D1-V25-account","region":"D1-V26-region",...}...,{"id":"D1-Vn1-idn1","provider":"D1-Vn2-provider","type":"D1-Vn3-type","parentsource":"D1-Vn4-parentsource","account": "D1-Vn5-account","region": "D1-Vn6-region",...}
{"id":"D2-V11-id11","provider":"D2-V12-provider","type":"D2-V13-type","parentsource":"D2-V14-parentsource","account":"D2-V15-account","region":"D2-V16-region",...},{"id":"D2-V21-id21","provider":"D2-V22-provider","type":"D2-V23-type","parentsource":"D2-V24-parentsource","account":"D2-V25-account","region":"D2-V26-region",...}...,{"id":"D2-Vn1-idn1","provider":"D2-Vn2-provider","type":"D2-Vn3-type","parentsource":"D2-Vn4-parentsource","account": "D2-Vn5-account","region": "D2-Vn6-region",...}
…
{"id":"Dn-V11-id11","provider":"Dn-V12-provider","type":"Dn-V13-type","parentsource":"Dn-V14-parentsource","account":"Dn-V15-account","region":"Dn-V16-region",...},{"id":"Dn-V21-id21","provider":"Dn-V22-provider","type":"Dn-V23-type","parentsource":"Dn-V24-parentsource","account":"Dn-V25-account","region":"Dn-V26-region",...}...,{"id":"Dn-Vn1-idn1","provider":"Dn-Vn2-provider","type":"Dn-Vn3-type","parentsource":"Dn-Vn4-parentsource","account": "Dn-Vn5-account","region": "Dn-Vn6-region",...}

This is not the standard JSON format. Any advice?
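One common approach in NiFi would be to split the flow file into lines (e.g. with SplitText) and then wrap each line in brackets (e.g. with ReplaceText) before handing it to a JSON-aware processor. A rough Python model of why the bracket wrap works (the sample data below is a shortened stand-in for one line of the message above):

```python
import json

def parse_batch_line(line: str) -> list:
    # Each line is a run of comma-separated JSON objects with no
    # enclosing array, so wrapping it in brackets yields valid JSON.
    return json.loads("[" + line + "]")

raw = '{"id":"D1-V11-id11","provider":"D1-V12-provider"},{"id":"D1-V21-21","provider":"D1-V22-provider"}'
records = parse_batch_line(raw)
print(len(records))        # 2
print(records[0]["id"])    # D1-V11-id11
```

The exact NiFi processor chain is an assumption here; the point is that a single-character wrap turns each non-standard line into a parseable JSON array.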
05-02-2019
10:39 PM
Hi Matt, I am not sure how to create the flow. In my case I have .xml files downloaded via an HTTP request, and these files need to be uploaded into a CLOB column in Oracle. I tried defining sql.args.1.type as 2005, but still no luck. Can you share an example of uploading large flow file content into CLOB columns?
04-10-2019
05:39 PM
I think this may be a bug in the NiFi JoltTransformRecord processor. What is the best way to report it?
04-09-2019
11:32 PM
Hi Matt. Works like a charm, thanks for your help!
04-02-2019
01:29 PM
You have specified the SQL Statement property but haven't supplied any values. I recommend replacing PutSQL with PutDatabaseRecord using a Statement Type of INSERT; that should do what you are trying to do.
03-21-2019
03:56 PM
Thanks @Matt. Actually, a Jolt transformation is the solution that worked for me (I got there a few days ago and have been looking for time to write up my answer 🙂). In my flow I route on an attribute depending on whether I want the payload, and then I apply this Jolt spec:

[{
  "operation": "shift",
  "spec": {
    "headers": "headers",
    "info": "info",
    "payLoad": "payLoad"
  }
}, {
  "operation": "default",
  "spec": {
    "_kafka": {
      "offset": "${kafka.offset}",
      "partition": "${kafka.partition}",
      "topic": "${kafka.topic}",
      "key": "${kafka.key}"
    },
    "_nifi": {
      "flowfileuuid": "${uuid}"
    }
  }
}]

Please confirm this is the right way; I'm new to NiFi.
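Jolt's "default" operation adds keys only when the record does not already contain them. A rough Python model of that semantics (a sketch, not NiFi or Jolt code; the record and default values below are placeholders, with the expression-language references already resolved to strings):

```python
def apply_defaults(record: dict, defaults: dict) -> dict:
    """Mimic Jolt's 'default' operation: add each default key only when
    the record lacks it, recursing into nested dictionaries."""
    out = dict(record)
    for key, value in defaults.items():
        if key not in out:
            out[key] = value
        elif isinstance(out[key], dict) and isinstance(value, dict):
            out[key] = apply_defaults(out[key], value)
    return out

record = {"headers": {"h": 1}, "info": "i", "payLoad": {"p": 2}}
defaults = {"_kafka": {"offset": "42", "topic": "events"},
            "_nifi": {"flowfileuuid": "abc-123"}}
enriched = apply_defaults(record, defaults)
print(enriched["_kafka"]["topic"])   # events
```

This is why the spec above safely layers the `_kafka` and `_nifi` blocks onto every record without overwriting anything the shift operation already produced.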
06-07-2019
02:29 PM
Thanks, Matt! Sorry for the late answer 🙂