Posts: 1702 | Kudos Received: 228 | Solutions: 80
About
My expertise is not in Hadoop but rather in online communities, support, and social media. Interests include photography, travel, movies, and watching sports.
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1700 | 06-29-2023 05:42 AM |
| | 1520 | 05-22-2023 07:03 AM |
| | 1142 | 05-22-2023 05:42 AM |
| | 1084 | 04-07-2023 07:03 AM |
| | 1639 | 06-29-2022 05:12 AM |
05-22-2023 06:40 AM
Congratulations on solving your issue, @SandyClouds, and thank you for posting the solution in case it can assist others.
05-22-2023 06:21 AM
Welcome to the community @HamTera. While you wait for a more knowledgeable member to respond, I'll ask whether the size of the table might be maxing out your memory. I did some looking around, and that came up as a possibility with large tables. I hope it helps.
05-22-2023 05:56 AM
Welcome to the community @JeffTheMilkMan. While you wait for a more knowledgeable response, I'll provide a link to Parsing XML Logs With Nifi – Part 1 of 3 in case it can be of assistance.
05-22-2023 05:42 AM
Welcome to the community @ariajesus. While you wait for a more knowledgeable community member to respond, I'll leave you with this blog article I found, in hopes it leads you down the right path:
Cloudera Machine Learning’s APIv2 enables automated project lifecycle management, CI/CD integration, and more
05-19-2023 11:43 AM
Welcome to the community @ollyhank. I'm not an expert, but I did find notes that version 1.2 included the following under "New or Improved Processors, Controller Services, and Reporting Tasks":
- A new record-oriented abstraction for reading/writing schema-aware event streams from CSV, JSON, Avro, Grok, and plain text, with easy extension to other formats/schemas
- QueryRecord processor to execute SQL queries over a stream of records, powered by Apache Calcite
- ConvertRecord processor to efficiently transform records from one schema and format into another
- SplitRecord processor to efficiently split huge record bundles into configurable batch sizes, either to divide and conquer or to protect downstream systems
- Processors to efficiently stream records into and out of Apache Kafka in a format- and schema-aware manner, automatically handling high throughput and full provenance
- Controller Services for plugging into and managing data schemas (Avro Schema Registry, Hortonworks Schema Registry) that integrate nicely with the record readers and writers
I'm not sure if that is the issue here, but it is something to consider while you wait for someone with more experience to reply.
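In case it helps narrow things down, here is a minimal sketch of checking which NiFi version an instance is actually running via its REST API. The base URL, port, and lack of authentication are assumptions for illustration; adjust them for your environment.

```python
# Minimal sketch: ask NiFi's REST API which version is running.
# Assumes an unsecured instance at localhost:8080 -- the URL and the
# missing authentication are placeholders, not a recommendation.
import requests

def get_nifi_version(base_url: str = "http://localhost:8080/nifi-api") -> str:
    """Return the version string reported by NiFi's /flow/about endpoint."""
    resp = requests.get(f"{base_url}/flow/about", timeout=10)
    resp.raise_for_status()
    # The response body contains an "about" object that includes the product version.
    return resp.json()["about"]["version"]

if __name__ == "__main__":
    print("NiFi version:", get_nifi_version())
```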
05-15-2023 12:09 PM
Congratulations on solving your issue, @SandyClouds, and thanks for taking the time to share it with the community.
05-15-2023 05:27 AM
@ryu, if @steven-matison answered your question, please mark his reply as the solution, as it will make it easier for others to find the answer in the future.
05-15-2023 05:25 AM
@orodriguesrenan Have you resolved your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
05-12-2023 12:54 PM
Welcome to the community @nymikek. Maybe @Shawn_Wang, @vdy, or @soychago can be of assistance.
05-12-2023 12:38 PM
Welcome to the community @Gutao. Perhaps @MattWho or @SAMSAL will be able to lead you in the right direction.