Member since: 02-01-2022
Posts: 270
Kudos Received: 96
Solutions: 59

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2027 | 06-12-2024 06:43 AM |
| | 2861 | 04-12-2024 06:05 AM |
| | 2061 | 12-07-2023 04:50 AM |
| | 1229 | 12-05-2023 06:22 AM |
| | 2141 | 11-28-2023 10:54 AM |
05-10-2023
05:42 AM
1 Kudo
@SandyClouds ^^ that is how you do it. One important thing to mention: that processor is not meant to be deployed in a set-it-and-forget-it situation. CDC against another database requires much more technical attention to how the data changes over time AFTER you grab the data for the first time. In my opinion, this processor is meant to be a conversation starter, or a way to take a one-time snapshot of a data source, where you may be watching it run but not expecting it to run indefinitely and keep two systems in sync.
05-10-2023
05:36 AM
@ryu Excellent questions here. I will address each below:

"So when I am asked, like in an interview, what CDH version do you use, and when I say CDH 7.2.16, the people interviewing me ask 'are you sure?'"

Yes, you are sure. Even new versions of CDP have CDH in the artifact filenames: 7.1.4-1.cdh7.1.4.p37.14288300

"So is there a CDP version of CDH vs maybe an on-prem version of CDH?"

Current versions of our platform are CDP 7.x. Previous versions are CDH 6.x and older. There is an on-prem CDP called CDP Private Cloud Base, and there is a public cloud CDP called CDP Public Cloud on AWS, Azure, and GCP.

"Is there some difference in versioning between on-prem or CDP, etc.?"

This is my favorite question. The differences between CDP on-prem and CDP in the cloud are going away quickly. It is part of our modern data architecture for workload movement from on prem to the cloud to have the fewest differences possible. As such, these two form factors of CDP are getting closer and closer to parity.
05-10-2023
05:24 AM
@mwblee HDP is no longer a supported platform. You cannot access the final HDP artifacts without a Cloudera subscription. I would highly recommend that you take a look at CDP and the modern, supported versions of the original HDP components.
05-08-2023
07:24 AM
@sridharavulapat You should be able to find the required Hive values in the hive-site.xml file. You can get this file from Cloudera Manager. Additionally, use Cloudera Manager to download the Hive drivers and get the fully qualified JDBC URL.
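For illustration only, here is a minimal sketch of pulling HiveServer2 connection values out of a downloaded hive-site.xml and assembling a JDBC URL from them. The file path and the specific property names are assumptions; clusters using ZooKeeper service discovery or TLS will have a different URL, and Cloudera Manager can also hand you the fully qualified URL directly.

```python
# Hedged sketch: parse a downloaded hive-site.xml and build a HiveServer2
# JDBC URL from it. Property names and defaults are assumptions and may
# differ on clusters using ZooKeeper discovery or TLS.
import xml.etree.ElementTree as ET

def hive_site_properties(path="hive-site.xml"):
    """Read a Hadoop-style XML config into a dict of property name -> value."""
    props = {}
    for prop in ET.parse(path).getroot().findall("property"):
        props[prop.findtext("name")] = prop.findtext("value")
    return props

props = hive_site_properties()
host = props.get("hive.server2.thrift.bind.host", "localhost")
port = props.get("hive.server2.thrift.port", "10000")
print(f"jdbc:hive2://{host}:{port}/default")
```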
05-04-2023
05:56 AM
@anony I believe your solution is to put/keep the Response Data as flowfile content. Then you can use QueryRecord or PartitionRecord to iterate through each "name" object in the Response Data array. Downstream, you can use EvaluateJsonPath to get the JSON object values into attributes or flowfile content, for example name = $.name, nameID = $.Speciality[0].nameId, etc.
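As a rough illustration of those paths, the plain-Python sketch below evaluates the same expressions against a hypothetical "name" record; the field values are made up to match the JsonPath expressions above.

```python
# Hedged sketch: plain-Python equivalents of the EvaluateJsonPath expressions
# above, run against a hypothetical record from the Response Data array.
import json

record = json.loads("""
{
  "name": "Dr. Example",
  "Speciality": [
    {"nameId": "42"}
  ]
}
""")

name = record["name"]                         # $.name
name_id = record["Speciality"][0]["nameId"]   # $.Speciality[0].nameId
print(name, name_id)
```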
05-04-2023
05:47 AM
1 Kudo
@ushasri You should check out NiFi Registry: https://nifi.apache.org/registry.html Using NiFi Registry you are able to version-control flows during development. Using the same NiFi Registry you can then deploy those flows to other environment(s) such as your Cloudera licensed version (HDF, CDF, CFM). Additionally, XML templates are going away. In modern versions of NiFi, you should use Create Flow Definition and transfer the resulting JSON definition files manually between environments and/or NiFi developers. Other CI/CD DFLC (Data Flow Lifecycle) approaches for deploying flows across environments use the NiFi CLI or REST API programmatically to perform deployment operations. We also often see some level of integration between GitHub and NiFi Registry.
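As one hedged example of the programmatic route, the sketch below lists buckets and versioned flows from a NiFi Registry instance over its REST API. The Registry address is an assumption, and a secured Registry would additionally need TLS and authentication configuration.

```python
# Hedged sketch: list versioned flows from a NiFi Registry over its REST API.
# The base URL is an assumption; adjust host, port, and security settings.
import requests

REGISTRY = "http://localhost:18080/nifi-registry-api"

for bucket in requests.get(f"{REGISTRY}/buckets").json():
    flows = requests.get(f"{REGISTRY}/buckets/{bucket['identifier']}/flows").json()
    for flow in flows:
        print(f"{bucket['name']} -> {flow['name']}")
```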
05-04-2023
05:39 AM
1 Kudo
@ushasri Since you describe going from 1.20.0-RC1 to 1.15.3-RC1, it is safe to assume those features are not available in 1.15.3. (Right-click on the canvas --> Enable all controller services / Disable all controller services) are some of the newer UI features.
04-28-2023
05:13 AM
@FediMannoubi Get the required response data that you need from the flowfile into attributes with EvaluateJsonPath. Do this for user, password, and token. Once these data values are attributes, you can use them in the header, or write them back out to the flowfile content when appropriate.
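Outside of NiFi, the same pattern looks roughly like the Python sketch below: extract the token from the authentication response and reuse it as a request header. The URLs and JSON field names here are assumptions, not taken from your flow.

```python
# Hedged sketch of the pattern: pull values out of a JSON auth response and
# reuse the token in a header on the next request (EvaluateJsonPath + InvokeHTTP
# analogue). URLs and field names are assumptions.
import requests

auth = requests.post("https://example.com/login",
                     json={"user": "demo", "password": "demo"}).json()

token = auth["token"]   # equivalent of EvaluateJsonPath: token <- $.token

response = requests.get("https://example.com/api/data",
                        headers={"Authorization": f"Bearer {token}"})
print(response.status_code)
```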
04-28-2023
05:04 AM
@harry_12 I believe that users created in Ambari are for logging into the Ambari UI. If you need an SSH user for a particular host, you should create that user directly in the operating system of the node(s).