Member since: 06-26-2015
Posts: 515
Kudos Received: 137
Solutions: 114
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2092 | 09-20-2022 03:33 PM |
| | 5733 | 09-19-2022 04:47 PM |
| | 3104 | 09-11-2022 05:01 PM |
| | 3443 | 09-06-2022 02:23 PM |
| | 5442 | 09-06-2022 04:30 AM |
02-23-2022 12:41 AM
Hi @RonMilne,

A claim can contain the content of one or more flowfiles; NiFi uses claims to optimize storage in the content repository. It is entirely possible for the input and output flowfiles of a processor to belong to the same claim. Please see this article for an explanation of what claims are and how they are managed. (The nifi.properties settings that influence claim packing are sketched below.)

Cheers, André
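A rough sketch of those settings, with illustrative values (exact defaults vary by NiFi version):

```
# nifi.properties (content repository section)
nifi.content.repository.implementation=org.apache.nifi.controller.repository.FileSystemRepository
nifi.content.repository.directory.default=./content_repository
# Maximum size a claim can grow to while still accepting content from additional flowfiles
nifi.content.claim.max.appendable.size=1 MB
```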
02-22-2022 11:23 PM
1 Kudo
@Elsaa,

If you can create a shell script that does what you need, you can use NiFi's ExecuteProcess processor to run that script (a minimal configuration sketch follows below).

André
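A minimal sketch, assuming a hypothetical script at /opt/scripts/my_task.sh:

```
ExecuteProcess
  Command               = /bin/bash
  Command Arguments     = /opt/scripts/my_task.sh
  Redirect Error Stream = true
```

Whatever the script writes to standard output becomes the content of the flowfile that ExecuteProcess emits.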
02-22-2022 11:03 PM
@wert_1311,

> Appreciate your reply. Regarding the first part of the question, I am trying to check the KDC server logs to troubleshoot an issue wherein an application is unable to renew its ticket, hence wanted to check the logs.

Instead of krb5.conf, please check the log locations in the kdc.conf file (a typical [logging] section is shown below). Assuming you're using an MIT KDC, this file is typically found at /var/kerberos/krb5kdc/kdc.conf on the KDC server.

> Now for the second issue, the logs are not getting written in /var/log/hue/ (on the host where the roles are configured) for kt_renewer.log & runcpserver.log, as the timestamp on both of them shows Nov 24. There is ample disk space available.

This is a bit strange. Either the service is in a funny state or the location of the logs was changed. Check the service configuration and look for the latest configured location. Also ensure you are looking at the right server, where the roles should be running.

Cheers, André
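For reference, on an MIT KDC the log destinations typically live in a [logging] section such as this (common default paths shown; adjust to your installation):

```
[logging]
  kdc = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
```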
02-22-2022 10:57 PM
@spserd,

Would you be able to share the output of these commands?

```
keytool -keystore /home/prabu/cyclone-nifi-keystore.jks -list -v
keytool -keystore /home/prabu/truststore.jks -list -v
```

André
02-22-2022 10:39 PM
1 Kudo
@Mun,

Instead of using the GetSFTP processor, you should use the ListSFTP -> FetchSFTP pattern (sketched below). The ListSFTP processor is stateful and remembers which files have already been listed, so each file is picked up only once.

Cheers, André
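A minimal sketch of the pattern; hostname, port, username, and path are placeholders:

```
ListSFTP                      (emits one flowfile per newly listed file)
  Hostname    = sftp.example.com
  Port        = 22
  Username    = nifi
  Remote Path = /incoming

FetchSFTP                     (fed by ListSFTP's "success" relationship)
  Hostname    = sftp.example.com
  Port        = 22
  Username    = nifi
  Remote File = ${path}/${filename}
```

Remote File resolves the path and filename attributes that ListSFTP writes to each flowfile; ${path}/${filename} is the processor's default value.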
02-22-2022 05:29 PM
Actually, an easier way to ignore the duplicated column name and still process the columns correctly would be to use a schema to describe your data. For example, say you have the following CSV:

```
col_a,col_b,col_b
1,2,3
4,5,6
```

You can configure your CSVReader with a schema (a sketch of such a configuration follows below) and the data will be processed correctly. (The original post showed the CSVReader configuration and the resulting output as screenshots.)

HTH, André
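A minimal sketch of such a configuration; the property values and the renamed third field (col_b_2) are assumptions, since the original screenshots are not preserved:

```
CSVReader
  Schema Access Strategy         = Use 'Schema Text' Property
  Treat First Line as Header     = true
  Ignore CSV Header Column Names = true
  Schema Text                    = (the Avro schema below)
```

```json
{
  "type": "record",
  "name": "csv_schema",
  "fields": [
    { "name": "col_a",   "type": "string" },
    { "name": "col_b",   "type": "string" },
    { "name": "col_b_2", "type": "string" }
  ]
}
```

With Ignore CSV Header Column Names set to true, the header line is skipped and the schema's unique field names are used instead of the duplicated ones.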
02-22-2022 05:20 PM
@celestial1122,

It would help if you could provide a sample of your input data and the expected output. In general, once NiFi converts the CSV data into flowfile records, a record cannot have duplicated column names. If you want to keep both columns' values, you must rename one of them to a different name. You could do that, for example, with a ReplaceText processor (sketched below). If you can provide examples, we can probably help with more ideas.

Regards, André
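A sketch of that rename, assuming the duplicate sits in the header line and using placeholder column names (recent NiFi versions allow ReplaceText to target only the first line):

```
ReplaceText
  Replacement Strategy         = Regex Replace
  Evaluation Mode              = Line-by-Line
  Line-by-Line Evaluation Mode = First-Line
  Search Value                 = ^(col_a,col_b),col_b$
  Replacement Value            = $1,col_b_2
```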
02-22-2022 05:10 PM
Hi @wert_1311,

> How do I check the location where Kerberos is writing logs? I checked the location mentioned in krb5.conf (default = FILE:/var/log/krb5libs.log, kdc = FILE:/var/log/krb5kdc.log, admin_server = FILE:/var/log/kadmind.log), however the log files mentioned in this location are empty. Am I checking an incorrect location?

Can you explain what you are trying to verify/check? What sort of Kerberos information are you looking for? What kind of problem are you troubleshooting?

> Secondly, logs on Hue Server & KT Renewer are not getting updated/current.

Do you mean that the logs have not been updated for a while? Is the disk full? What's the location of the logs that you're looking at? Have you double-checked that you are checking the logs on the correct hosts, where the Hue Server and KT Renewer are deployed?

Cheers, André
02-22-2022 03:50 PM
Hi @spserd,

I believe this issue is due to your Flink configuration, not NiFi. Your Flink job must be configured with a keystore in addition to the truststore: the keystore is what the Flink application uses to authenticate itself to the NiFi nodes (a rough sketch follows below). Can you please check (and share) the Flink job configuration?

Cheers, André
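A rough sketch of what that could look like, assuming the Flink job builds a NiFi site-to-site client config (all URLs, names, paths, and passwords are placeholders; verify the API against your NiFi and Flink connector versions):

```java
import org.apache.nifi.remote.client.KeystoreType;
import org.apache.nifi.remote.client.SiteToSiteClient;
import org.apache.nifi.remote.client.SiteToSiteClientConfig;

public class NifiSiteToSiteConfigSketch {
    // Builds a client config that carries BOTH stores: the keystore
    // identifies the Flink job to NiFi; the truststore lets the job
    // verify the NiFi nodes' certificates.
    public static SiteToSiteClientConfig build() {
        return new SiteToSiteClient.Builder()
                .url("https://nifi-host:8443/nifi")              // placeholder
                .portName("For Flink")                           // placeholder
                .keystoreFilename("/path/to/flink-keystore.jks") // placeholder
                .keystorePass("keystorePassword")                // placeholder
                .keystoreType(KeystoreType.JKS)
                .truststoreFilename("/path/to/truststore.jks")   // placeholder
                .truststorePass("truststorePassword")            // placeholder
                .truststoreType(KeystoreType.JKS)
                .buildConfig();
    }
}
```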
02-22-2022 03:42 PM
Yes, it is possible:

```json
[
  {
    "operation": "shift",
    "spec": {
      "Data": {
        "ARRAY_ONE": {
          "*": {
            "@(2,ID)": "[#2].ID",
            "@(2,DATE)": "[#2].DATE",
            "NAME": "[#2].NAME",
            "details": {
              "*": {
                "address": "[#4].detail_&1_address"
              }
            },
            "@(0,details)": "[#2].details"
          }
        }
      }
    }
  }
]
```
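For illustration, a hypothetical input consistent with that spec (ID and DATE at the Data level, ARRAY_ONE entries each carrying a NAME and a details array):

```json
{
  "Data": {
    "ID": "123",
    "DATE": "2022-02-22",
    "ARRAY_ONE": [
      {
        "NAME": "first",
        "details": [
          { "address": "addr_1" },
          { "address": "addr_2" }
        ]
      }
    ]
  }
}
```

would be shifted into:

```json
[
  {
    "ID": "123",
    "DATE": "2022-02-22",
    "NAME": "first",
    "detail_0_address": "addr_1",
    "detail_1_address": "addr_2",
    "details": [
      { "address": "addr_1" },
      { "address": "addr_2" }
    ]
  }
]
```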