09-15-2017
09:16 AM
How far back do you have files in the dataDirs? There have been instances where Flume holds on to a specific event that has already been delivered because the checkpoint still references it. If that is the case, you can stop Flume, delete the files in the checkpoint directory, and force a replay of the events in the channel (note this may take a long time, depending on the size of all the logs in the dataDirs). You can set use-fast-replay=true to make the replay faster, but you'll need to increase the heap size as well if you choose fast replay. -pd
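For reference, a file channel configured for fast replay might look like the sketch below (the agent/channel names and paths are placeholders, not from this thread; you'd bump the agent heap in flume-env.sh alongside it):
# placeholder agent/channel names and paths
agent.channels.c1.type = file
agent.channels.c1.checkpointDir = /var/lib/flume-ng/checkpoint
agent.channels.c1.dataDirs = /var/lib/flume-ng/data
agent.channels.c1.use-fast-replay = true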
09-15-2017
09:00 AM
The Flume file channel removes old data log files once they have been delivered by the sink and are no longer needed (it always keeps at least two files; all others are eligible for deletion). What is the size of your channel? Is it increasing? Are your sinks able to keep up? You can't purge old data manually (Flume still needs anything it hasn't already deleted), but you can control how much free space must remain on the disk with the minimumRequiredSpace channel property. You can also use the capacity property to restrict how many events can be in the channel. If files in the dataDirs are not getting deleted, it's because the sinks haven't delivered the events from those files. -pd
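As a sketch, the two knobs mentioned above sit on the channel definition like this (names and values are illustrative only; minimumRequiredSpace is in bytes):
# illustrative values; tune capacity and the free-space floor for your disks
agent.channels.c1.type = file
agent.channels.c1.capacity = 1000000
agent.channels.c1.minimumRequiredSpace = 524288000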
09-08-2017
08:51 AM
1 Kudo
Your best bet would be to use Sentry to provide the authorization, with Kerberos and AD handling authentication. You can use sssd on the Linux nodes to make the AD users and groups available to Kafka:
https://www.cloudera.com/documentation/enterprise/latest/topics/sg_auth_overview.html
https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html
-pd
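Once sssd is working, a quick sanity check on a broker node is to resolve an AD user and group from the command line (the names here are examples, not from this thread):
id aduser01
getent group kafka-admins
If those return the expected AD entries, Kafka and Sentry will see the same users and groups.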
09-08-2017
08:48 AM
1 Kudo
If you are using 0.0.0.0 for listeners, you'll also need an advertised.listeners safety valve to specify what host:port the clients should connect to, as they can't connect to 0.0.0.0. Generally, this is done with external/internal DNS. -pd
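A sketch of that pairing in the safety valve (the hostname is a placeholder for whatever name your DNS actually resolves for clients):
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://broker1.example.com:9092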
09-07-2017
11:22 AM
1 Kudo
You are correct: SASL_PLAINTEXT only provides authentication, not encryption. You'll want SASL_SSL if you need encrypted traffic as well. You can set the inter-broker protocol to a different value if you'd like to encrypt only client/server traffic, but if you leave it set to Inferred in CM, it will use whatever your listener value is set to. -pd
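For example, a broker that keeps inter-broker traffic on SASL_PLAINTEXT while clients use SASL_SSL might carry both listeners (the ports here are illustrative):
listeners=SASL_PLAINTEXT://0.0.0.0:9092,SASL_SSL://0.0.0.0:9093
security.inter.broker.protocol=SASL_PLAINTEXT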
07-27-2017
08:29 AM
1 Kudo
This indicates that your jaas.conf references a keytab that needs a password, or that you are using the ticket cache without doing a kinit before running this command. Confirm that you are able to connect to the cluster (hdfs dfs -ls /) from the command line first, and then check your jaas.conf against this documentation:
https://www.cloudera.com/documentation/kafka/latest/topics/kafka_security.html
-pd
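For comparison, a minimal keytab-based jaas.conf looks roughly like this (the principal and keytab path are placeholders):
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka_client.keytab"
  principal="user@EXAMPLE.COM";
};
If you'd rather use the ticket cache, set useTicketCache=true instead of the keytab options and run kinit first.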
07-26-2017
07:52 PM
1 Kudo
What are the properties in your consumer.config file? Does it have SASL_SSL as the protocol? You can verify whether the broker is listening with SSL correctly:
openssl s_client -connect svd0hdatn01:9093
-pd
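For reference, a working SASL_SSL consumer config usually includes something like the following (the paths and password are placeholders):
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit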
07-26-2017
01:55 PM
1 Kudo
Connection refused seems to indicate that this host is not listening on port 9092: svd0hdatn01.<DOMAIN>:9092. You can see what ports Kafka is listening on:
ps -ef | grep kafka
netstat -nap | grep <kafka pid>
-pd
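If nc is available on the client host, you can also probe the port directly:
nc -vz svd0hdatn01.<DOMAIN> 9092
A "connection refused" there confirms nothing is bound on that port, independent of Kafka.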
07-26-2017
12:35 PM
You need the DEBUG level set for the clients, not the brokers. This would be in the "Gateway Logging Threshold", or, on the system where you are running the console commands, by editing /etc/kafka/conf/tools-log4j.properties. -pd
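For example, the relevant line in tools-log4j.properties changes roughly like this (the default level is typically WARN):
log4j.rootLogger=DEBUG, console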
07-26-2017
12:05 PM
Try turning on DEBUG for the client commands. If you have a Kafka gateway installed on that node, you can set the DEBUG level in the Kafka service; otherwise, modify /etc/kafka/conf/tools-log4j.properties to set the log level to DEBUG, and then run your producer or consumer. Some "retryable" errors when security is enabled keep the clients from properly connecting. -pd