Member since: 06-26-2015
Posts: 509
Kudos Received: 136
Solutions: 114
My Accepted Solutions
Title | Views | Posted
---|---|---
| 1316 | 09-20-2022 03:33 PM
| 3856 | 09-19-2022 04:47 PM
| 2270 | 09-11-2022 05:01 PM
| 2357 | 09-06-2022 02:23 PM
| 3704 | 09-06-2022 04:30 AM
09-04-2022
05:29 PM
@syntax_ , Please try running this command: xxd message.avro — then you can copy and paste the output here. Cheers, André
09-02-2022
03:18 AM
OK. Did you try the LDAP configuration I mentioned above? Cheers, André
09-02-2022
03:16 AM
Can you please send me that file in a private message. Copy and paste won't work 🙂 Cheers, André
09-01-2022
10:25 PM
@hebamahmoud , Try enabling the Kudu Service as a dependency for the Impala service, as shown below. When you do that, Impala will use the selected Kudu service as the default service whenever you use a Kudu table that does not explicitly set the Kudu masters. If you don't set the property above, you can still use Kudu from Impala, but when you create the table you have to specify the Kudu masters through the TBLPROPERTIES clause. Cheers, André
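For reference, here is a rough sketch of the TBLPROPERTIES approach mentioned above — the table, columns, and master host name are placeholders, so substitute your own schema and actual Kudu master address(es):

```sql
-- Hypothetical example: creating a Kudu-backed table from Impala when no
-- default Kudu service dependency is configured in Cloudera Manager.
-- 'kudu-master-1:7051' is a placeholder; replace it with your Kudu masters
-- (comma-separated if you have more than one).
CREATE TABLE my_kudu_table (
  id BIGINT,
  name STRING,
  PRIMARY KEY (id)
)
PARTITION BY HASH (id) PARTITIONS 3
STORED AS KUDU
TBLPROPERTIES ('kudu.master_addresses' = 'kudu-master-1:7051');
```

With the Kudu service dependency enabled instead, the TBLPROPERTIES clause can be omitted and Impala fills in the masters automatically.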
09-01-2022
10:15 PM
@ajignacio , What's the output of the command below if you run it from the same machine where NiFi is running? openssl s_client -connect ldap.dev.abcde:389 — I know you are not using TLS, but the command above can still give us some useful information. Cheers, André
09-01-2022
10:09 PM
I actually wanted to have a look at the binary Avro data that is in Kafka, not the deserialized content. Something like this: kafka-console-consumer --from-beginning --bootstrap-server admin:9092 --topic pbs_jobs --max-messages 1 > message.avro Cheers, André
09-01-2022
02:25 AM
Would you be able to save one of these messages to a file and share it with me?
08-31-2022
06:32 PM
@syntax_ , I believe you have a schema that you can use to parse your Avro data, right? Instead of using ConsumeKafka, use the ConsumeKafkaRecord processor. In that processor, specify a Record Reader of type AvroReader and provide the correct schema so that the reader can properly deserialize your data. If you want to convert the data to JSON, you can then specify a JsonRecordSetWriter as the Record Writer for that processor, so that the output flowfiles will be in that format and you'll be able to inspect the content of the queues. Cheers, André
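For anyone unsure what to supply to the AvroReader, it expects a standard Avro record schema in JSON form. The record and field names below are invented for illustration only — the real schema has to match the fields your producer actually writes:

```json
{
  "type": "record",
  "name": "PbsJob",
  "fields": [
    {"name": "job_id",  "type": "string"},
    {"name": "user",    "type": "string"},
    {"name": "runtime", "type": ["null", "long"], "default": null}
  ]
}
```

In the AvroReader controller service, you would typically set the schema access strategy to use schema text and paste a schema like this in, assuming your Kafka messages are raw Avro without an embedded schema.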
08-30-2022
03:47 PM
@Rgupta7 , Check your flows for flowfiles stuck in queues (e.g., maybe you have connections to stopped processors or dead-end funnels, and messages stay there indefinitely). Flowfiles are physically stored in bigger files that contain many flowfiles. If even one of those flowfiles is referenced by any queue, that file is never removed from disk. Cheers, André