Member since
02-27-2020
173
Posts
42
Kudos Received
48
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1089 | 11-29-2023 01:16 PM |
| | 1171 | 10-27-2023 04:29 PM |
| | 1156 | 07-07-2023 10:20 AM |
| | 2518 | 03-21-2023 08:35 AM |
| | 921 | 01-25-2023 08:50 PM |
07-07-2021
04:30 PM
@TVR , in the link you shared, Step 11/12 shows how to set up the .props file. Note that the property gg.handler.kafkaconnect.topicMappingTemplate is used instead of TopicName. Check your .props file and see what is set there.
07-07-2021
04:04 PM
Hi @TVR , Next time, please provide more details on what you have tried, which versions of Kafka and Oracle you are using, and what you are trying to achieve. In the meantime, it looks like TopicName is no longer a supported parameter as of recent versions of Oracle GoldenGate; the replacement parameter is topicMappingTemplate. Please see the Oracle documentation here: https://docs.oracle.com/goldengate/bd123110/gg-bd/GADBD/using-kafka-handler.htm#GADBD458 Hope that helps, Alex
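For reference, a minimal .props sketch using the newer parameter might look like the fragment below. This is an illustrative sketch, not a complete config: the handler name "kafkaconnect" and the ${tableName} template choice are assumptions to adapt to your setup.

```properties
# Hypothetical OGG for Big Data Kafka Connect handler fragment.
gg.handlerlist=kafkaconnect
gg.handler.kafkaconnect.type=kafkaconnect

# TopicName is not accepted in newer releases; topicMappingTemplate replaces it.
# ${tableName} resolves per record to the source table name.
gg.handler.kafkaconnect.topicMappingTemplate=${tableName}
gg.handler.kafkaconnect.keyMappingTemplate=${primaryKeys}
```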
07-06-2021
10:20 PM
Hi @Chakkara , What would you like the parser to do when it encounters an invalid date value? If this is a recurring problem, where multiple lines in your file have the same number of spaces to indicate missing date, then you can use ifElse in your replacement value command (something like this). Hope that helps, Alex
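The ifElse idea can be sketched in plain Python. This shows only the fallback logic, not the tool's actual syntax; the date format and the all-spaces convention for a missing value are assumptions.

```python
from datetime import datetime

def parse_date(field, fmt="%Y-%m-%d", default=None):
    """Return a parsed date, or a default when the field is blank or invalid."""
    value = field.strip()
    if not value:           # all-spaces placeholder -> treat as missing
        return default
    try:
        return datetime.strptime(value, fmt).date()
    except ValueError:      # malformed date -> fall back, like ifElse
        return default

print(parse_date("2021-07-06"))  # 2021-07-06
print(parse_date("        "))    # None
```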
06-26-2021
07:03 AM
1 Kudo
Hi @roshanbi , There is no equivalent of Oracle's EXTRACTVALUE in Kudu or Impala SQL; XML is not one of the supported formats.

I think your approach of running the SQL with EXTRACTVALUE as part of the Sqoop job is a good one. That way you write the fields you need into the target Hive table and from there insert into Kudu. This is the fastest path and works well if you are not planning to pull more fields from your source XML table in the future.

An alternative approach is to use Hive with a third-party SerDe for XML. Let Hive convert the XML to a format such as Avro, and then create your Impala table on top of that Avro data. This way is more involved, but gives you more flexibility in terms of the schema.

Hope that helps, Alex
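For illustration, the kind of per-row extraction EXTRACTVALUE performs can also be reproduced in a small pre-processing step outside the database. The XML shape and field names below are made up for the example.

```python
import xml.etree.ElementTree as ET

# Hypothetical source row: an XML document stored in a string column.
row_xml = "<order><id>42</id><customer><name>Acme</name></customer></order>"

root = ET.fromstring(row_xml)
# Equivalent of Oracle's EXTRACTVALUE(col, '/order/customer/name'):
name = root.findtext("./customer/name")
order_id = root.findtext("./id")
print(order_id, name)  # 42 Acme
```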
06-18-2021
12:38 PM
Oh, I thought you had the broken pipe in your Excel file, but this makes more sense. Thanks for clarifying. Do you need to use the broken pipe delimiter on the output because the target process only accepts that character as the delimiter? It seems strange to use the broken pipe for a CSV file. You can have ConvertExcelToCSVProcessor write out a simple comma-separated format, then use a ReplaceText processor to replace the delimiter commas with your broken pipe character (see example here). Will that work for you?
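The comma-to-broken-pipe substitution that ReplaceText would perform can be sketched in Python; the sample CSV line is made up.

```python
# The broken bar / broken pipe is U+00A6. A literal comma-to-broken-pipe
# replacement mirrors what a NiFi ReplaceText processor would do on the
# comma-separated output of ConvertExcelToCSVProcessor.
csv_line = "id,name,amount"
broken_bar = "\u00a6"
converted = csv_line.replace(",", broken_bar)
print(converted)  # id¦name¦amount
```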
06-18-2021
11:26 AM
The broken pipe character (¦) is a non-ASCII character that takes more than one byte in common encodings such as UTF-8. A workaround here could be to run a ReplaceText processor on your file first, replacing the broken pipe character with something simple like a comma (,), and then doing the conversion to CSV.
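A quick way to confirm the multi-byte claim, assuming UTF-8 encoding:

```python
# The broken pipe (U+00A6) encodes to two bytes in UTF-8,
# unlike the plain ASCII pipe, which is a single byte.
print(len("\u00a6".encode("utf-8")))  # 2
print(len("|".encode("utf-8")))       # 1
```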
06-18-2021
11:12 AM
@Stefi , could you please provide an example of what one input record looks like, what you want the output CSV record to look like, and what output you are getting right now? That will help us understand the problem you are facing.
06-18-2021
10:15 AM
Hi @Stefi , It could be as simple as setting the Include Trailing Delimiter parameter to False in your NiFi processor. Then NiFi should not include any delimiter at the end of each CSV record. See this documentation page for ConvertExcelToCSVProcessor. Hope this helps you out. Regards, Alex
06-17-2021
01:35 PM
Hi @drgenious , I believe this is possible by passing impala-shell the --query_option parameter:
impala-shell -f /path/ --query_option='mem_limit=3gb'
Let me know if that works. Regards, Alex
06-16-2021
10:13 AM
Hi @sakitha , What version of CDH/HDP/CDP are you using? If you are using Cloudera Manager (CM), please check the MirrorMaker parameter values for presence of any "}" characters. The error, as shown, indicates there is something wrong with Whitelist parameter or the like. Regards, Alex Akulov