Member since: 06-26-2015
Posts: 509
Kudos Received: 136
Solutions: 114
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1284 | 09-20-2022 03:33 PM
 | 3784 | 09-19-2022 04:47 PM
 | 2249 | 09-11-2022 05:01 PM
 | 2333 | 09-06-2022 02:23 PM
 | 3630 | 09-06-2022 04:30 AM
02-08-2022
02:44 PM
1 Kudo
Looking at the serialized data, it appears to be the Java binary serialization protocol. It seems the producer is simply writing the HashMap Java object directly to Kafka, rather than using a proper serializer (Avro, JSON, String, etc.). You should look into modifying your producer so that the data you read from Kafka can be properly deserialized.
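As a minimal sketch of what a fixed producer could look like, here the map is serialized to JSON text before sending (assumes the kafka-clients and Jackson libraries; the topic name and broker address are placeholders):

import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class JsonMapProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092"); // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        Map<String, String> payload = new HashMap<>();
        payload.put("id", "42");

        // Serialize the map as JSON text instead of Java binary serialization,
        // so any consumer (NiFi, kafka-console-consumer, etc.) can read it.
        String json = new ObjectMapper().writeValueAsString(payload);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", json)).get();
        }
    }
}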
02-08-2022
01:30 PM
Files whose names start with a dot are considered "hidden files" in Linux. To read those files with the ListFile processor, simply set Ignore Hidden Files to false.
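For reference, this dot-prefix convention is the same check Java itself applies on Linux; a quick sketch (the file path is hypothetical):

import java.io.File;

public class HiddenFileCheck {
    public static void main(String[] args) {
        // On Linux, File.isHidden() returns true for names starting with a dot.
        File f = new File("/tmp/.example.csv"); // hypothetical path
        System.out.println(f.getName() + " hidden? " + f.isHidden());
    }
}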
02-08-2022
01:21 PM
What's the underlying database? Could you please share the full configuration of the PutDatabaseRecord processor?
02-08-2022
01:16 PM
Older versions of CDP, as well as the latest patches and releases, are only available to customers with an active Cloudera subscription. For more information on how to get a subscription, please take a look at this: https://www.cloudera.com/products/pricing.html#private-cloud-services

If you are looking for a trial version of CDP, you can use CDP 7.1.7.0 and Cloudera Manager, which can be downloaded from the following URLs:

https://archive.cloudera.com/cdh7/
https://archive.cloudera.com/cm7/

Regards,
André
02-08-2022
01:02 PM
The flow that you are trying to implement is not very efficient, and it's hard to achieve what you want that way. You should try to refactor it using record-based processors, which are much simpler and more efficient for handling large numbers of records. Something like this:

ExecuteSQLRecord -> QueryRecord -> PutDatabaseRecord

- ExecuteSQLRecord: query and stream records from the source database.
- QueryRecord: convert the records from the source format to the target format (e.g. SELECT id, value FROM FLOWFILE).
- PutDatabaseRecord: insert the converted records into the target database.

André
02-07-2022
07:57 PM
@an_dutra My guess is that it's a misconfiguration on your cluster. I just tested this on my Kafka cluster, and once the certificate expires, if I try to connect to the cluster with a Kafka client I get the following exception:

Caused by: sun.security.validator.ValidatorException: PKIX path validation failed:
...
Caused by: java.security.cert.CertPathValidatorException: validity check failed
...
Caused by: java.security.cert.CertificateExpiredException: NotAfter: Tue Feb 08 03:45:00 UTC 2022

The Kafka brokers will continue to run, though. However, if they are stopped and I try to start them again, they fail to start with the same exception as the one above.
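If you want to check broker certificate expiry yourself, a minimal sketch using the standard JDK KeyStore API is below; it prints the NotAfter date that triggers the CertificateExpiredException above (keystore path and password are passed as arguments and are placeholders for your broker's actual keystore):

import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.Certificate;
import java.security.cert.X509Certificate;
import java.util.Enumeration;

public class CertExpiryCheck {
    public static void main(String[] args) throws Exception {
        String keystorePath = args[0];            // e.g. the broker's JKS keystore
        char[] password = args[1].toCharArray();  // keystore password

        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(keystorePath)) {
            ks.load(in, password);
        }

        Enumeration<String> aliases = ks.aliases();
        while (aliases.hasMoreElements()) {
            String alias = aliases.nextElement();
            Certificate c = ks.getCertificate(alias);
            if (c instanceof X509Certificate) {
                // NotAfter is the expiry date shown in CertificateExpiredException.
                System.out.println(alias + " expires: " + ((X509Certificate) c).getNotAfter());
            }
        }
    }
}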
02-07-2022
02:36 PM
What @DigitalPlumber said. Also make sure that Match Requirement is set to "content must contain match"; otherwise the processor will only route content that matches the expression in its entirety.
02-06-2022
09:11 PM
1 Kudo
You need to find out which serializer is being used to write data to Kafka and then use the matching deserializer to read those messages.
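As a sketch, if it turned out the producer was using the String serializer, the matching consumer configuration would look like this (the broker address, group id, and topic name are placeholders):

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class StringConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092"); // placeholder
        props.put("group.id", "example-group");        // placeholder
        // The deserializers must match the serializers used by the producer.
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("my-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.value());
            }
        }
    }
}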
02-06-2022
07:56 PM
Without more information, it seems to me that the content of your metadata file is not correct. It appears to be a SAML Assertion rather than a SAML Metadata document.
02-06-2022
07:45 PM
Hi Minh, how and where did you generate the metadata.xml file? André