Member since: 02-01-2022
Posts: 274
Kudos Received: 97
Solutions: 60
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 401 | 05-15-2025 05:45 AM |
| 3396 | 06-12-2024 06:43 AM |
| 5925 | 04-12-2024 06:05 AM |
| 4055 | 12-07-2023 04:50 AM |
| 2178 | 12-05-2023 06:22 AM |
10-13-2023
07:31 PM
@vaishaakb I noticed this same activity after deploying the latest version of CM and the parcels in my lab cluster: I started getting P2P violations from my IDS and IPS. Is there any way to control the external P2P process? I've attached screen captures from my firewall. CDP: 7.1.9-1.cdh7.1.9.p0.44702451, CM: 7.11.3. Example of the detection: all 5 of my nodes repeatedly trying to talk to hosts across the globe.
10-06-2023
04:39 PM
@kuhbrille Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. Thanks.
10-03-2023
08:27 AM
1 Kudo
@MWM Before sendEmail, you need to add a DetectDuplicate processor: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.12.1/org.apache.nifi.processors.standard.DetectDuplicate/ You can find a sample template here: https://github.com/steven-matison/NiFi-Templates/blob/master/DetectDuplicate_DistributedMapCache_Demo.xml
09-27-2023
07:43 AM
If the partitioned data is laid out like below: <s3:bucket>/<some_location>/<part_column>=<part_value>/<filename> you can create an external table specifying that location and run 'MSCK REPAIR TABLE <table_name> SYNC PARTITIONS' to sync the partitions. Validate the data by running some sample SELECT statements. Once that's done, you can create a new external table on the other bucket and run an INSERT statement with dynamic partitioning (see the sketch below). Ref - https://cwiki.apache.org/confluence/display/hive/dynamicpartitions
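A minimal HiveQL sketch of those steps; all bucket paths, table names, and columns are placeholders, not taken from the thread, so adjust them to your actual layout:

```sql
-- 1. External table over the existing partitioned S3 layout.
CREATE EXTERNAL TABLE source_tbl (
  id BIGINT,
  payload STRING
)
PARTITIONED BY (part_column STRING)
STORED AS PARQUET
LOCATION 's3a://source-bucket/some_location/';

-- 2. Register the partitions already present under that location.
MSCK REPAIR TABLE source_tbl SYNC PARTITIONS;

-- 3. Validate with a sample query.
SELECT part_column, COUNT(*) FROM source_tbl GROUP BY part_column;

-- 4. New external table on the other bucket, then a dynamic-partition insert.
CREATE EXTERNAL TABLE target_tbl (
  id BIGINT,
  payload STRING
)
PARTITIONED BY (part_column STRING)
STORED AS PARQUET
LOCATION 's3a://target-bucket/some_location/';

SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Partition column must come last in the SELECT list.
INSERT OVERWRITE TABLE target_tbl PARTITION (part_column)
SELECT id, payload, part_column
FROM source_tbl;
```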
09-14-2023
12:46 AM
@tqiu Circling back to see if you've had a chance to review our update. - V
09-13-2023
12:32 AM
Hi @rupeshh I am also facing a similar situation and wanted to clarify a few things with you. I see that in the ExecuteStreamCommand configuration you point the command path at the venv inside the repository. Does that mean you have the venv within the Docker container, or are you referring to one outside the container (on the host machine)?
09-07-2023
07:30 PM
So I copied only the NARs we actually use, and the container can launch now. I did have to remove a few NARs that were causing issues, such as nifi-ssl-context-service-nar-1.10.0.nar. The existing flows no longer have issues with properties that are obsolete in 1.22.0, since the 1.10.0 NARs are used for those components. Thanks for all the inputs.
09-06-2023
08:04 AM
@Kiranq Why are you using ExecuteScript? You can set up a DBCP (DBCPConnectionPool) controller service with your SQL connection details and driver file. Make sure the JDBC driver is present on all NiFi hosts. Then you can use any processor that references a DBCP controller service, for example ExecuteSQL; a rough sketch follows below.
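For illustration, a sketch of how this hangs together, with the controller-service settings shown as comments; the connection URL, driver, and table name are made up for the example:

```sql
-- DBCPConnectionPool settings (configured in the NiFi UI; example values only):
--   Database Connection URL     : jdbc:postgresql://dbhost:5432/mydb
--   Database Driver Class Name  : org.postgresql.Driver
--   Database Driver Location(s) : /opt/nifi/drivers/postgresql.jar  (same path on every NiFi host)
--
-- With the pool in place, ExecuteSQL only needs a query; 'orders' is a
-- placeholder table name:
SELECT id, status, updated_at
FROM orders
WHERE updated_at >= '2023-01-01';
```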
08-28-2023
10:46 AM
@john0123 have you been able to resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
08-28-2023
08:05 AM
@JohnnyRocks, as @steven-matison said, you should avoid chaining so many ReplaceText processors. I am not quite sure I understood your flow exactly, but something tells me that before reaching ReplaceText, something is not properly configured in your NiFi flow.

First of all, when using the classic Java date format, "MM" will always render as a two-digit month, meaning that months 1 to 9 automatically get a leading zero; "dd" does the same for days. As I see in your post, your CSV reader is configured to read the data as MM/dd/yy, which should be fine, but somehow something is missing here ---> how do you reach the format dd/MM/yyyy?

What I would personally try is to convert all those date values into the same format. So instead of all those ReplaceText processors, I would insert an UpdateRecord processor, where I would define my RecordReader and my RecordWriter with the desired schemas (make sure that your column is type int with logical type date). Next, in that processor, I would change the Replacement Value Strategy to "Record Path Value", press + to add a new property, and name it "/Launch_Date" (pay attention to the leading slash). I would assign it the value " format( /Launch_Date, "dd/MM/yyyy", "Europe/Bucharest") " (or any other timezone you require -- if you need your data in UTC, just remove the comma and the timezone).