
Problems with Kafka command line utils after upgrade to HDP 2.3.4

Explorer

Hello,

I upgraded HDP from 2.3.2 to 2.3.4 the other day using the Ambari Upgrade Guide and Express Upgrade. The system only has ZooKeeper, Kafka and Ambari Metrics. This all initially seemed to work fine, but the problem occurred when I tested the Kafka command line utils in /usr/hdp/current/kafka-broker/bin/:

  • kafka-console-consumer.sh
  • kafka-console-producer.sh
  • kafka-consumer-perf-test.sh
  • kafka-producer-perf-test.sh
  • kafka-topics.sh

I have used these commands a number of times prior to the 2.3.4 upgrade without any problems, but I am getting all kinds of different error messages after the upgrade.

$ ./bin/kafka-topics.sh --zookeeper zoo1:2181,zoo2:2181,zoo3:2181 --list
log4j:ERROR Could not read configuration file from URL [file:/usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties].
java.io.FileNotFoundException: /usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties (No such file or directory)
$ ./bin/kafka-console-consumer.sh --zookeeper zoo1:2181,zoo2:2181,zoo3:2181 --max-messages 100 --topic perftest --from-beginning
log4j:ERROR Could not read configuration file from URL [file:/usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties].
java.io.FileNotFoundException: /usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties (No such file or directory)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStream.<init>(FileInputStream.java:146)
        at java.io.FileInputStream.<init>(FileInputStream.java:101)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.apache.log4j.Logger.getLogger(Logger.java:117)
        at org.I0Itec.zkclient.ZkClient.<clinit>(ZkClient.java:63)
        at kafka.utils.ZkUtils$.createZkClient(ZkUtils.scala:83)
        at kafka.tools.ConsoleConsumer$.checkZkPathExists(ConsoleConsumer.scala:335)
        at kafka.tools.ConsoleConsumer$.checkZk(ConsoleConsumer.scala:83)
        at kafka.tools.ConsoleConsumer$.run(ConsoleConsumer.scala:62)
        at kafka.tools.ConsoleConsumer$.main(ConsoleConsumer.scala:47)
        at kafka.tools.ConsoleConsumer.main(ConsoleConsumer.scala)
log4j:ERROR Ignoring configuration file [file:/usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties].
log4j:WARN No appenders could be found for logger (org.I0Itec.zkclient.ZkClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
No brokers found in ZK.
$ ./bin/kafka-consumer-perf-test.sh --topic perftest --messages 100 --zookeeper zoo1:2181,zoo2:2181,zoo3:2181
log4j:ERROR Could not read configuration file from URL [file:/usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties].
java.io.FileNotFoundException: /usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties (No such file or directory)
        at java.io.FileInputStream.open(Native Method)
        at java.io.FileInputStream.<init>(FileInputStream.java:146)
        at java.io.FileInputStream.<init>(FileInputStream.java:101)
        at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
        at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.apache.log4j.Logger.getLogger(Logger.java:117)
        at kafka.tools.ConsumerPerformance$.<init>(ConsumerPerformance.scala:44)
        at kafka.tools.ConsumerPerformance$.<clinit>(ConsumerPerformance.scala)
        at kafka.tools.ConsumerPerformance.main(ConsumerPerformance.scala)
log4j:ERROR Ignoring configuration file [file:/usr/hdp/current/kafka-broker/bin/../config/tools-log4j.properties].
log4j:WARN No appenders could be found for logger (kafka.tools.ConsumerPerformance$).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
start.time, end.time, data.consumed.in.MB, MB.sec, data.consumed.in.nMsg, nMsg.sec
Exception in thread "main" org.apache.kafka.common.KafkaException: File /usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf cannot be read.
        at org.apache.kafka.common.security.JaasUtils.isZkSecurityEnabled(JaasUtils.java:95)
        at kafka.consumer.ZookeeperConsumerConnector.connectZk(ZookeeperConsumerConnector.scala:200)
        at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:145)
        at kafka.consumer.ZookeeperConsumerConnector.<init>(ZookeeperConsumerConnector.scala:162)
        at kafka.consumer.Consumer$.create(ConsumerConnector.scala:109)
        at kafka.tools.ConsumerPerformance$.main(ConsumerPerformance.scala:72)
        at kafka.tools.ConsumerPerformance.main(ConsumerPerformance.scala)

So clearly something is either missing or misconfigured.

The upgrade guide for HDP 2.3.2 has a "manual upgrade" section (no longer present in the guide for 2.3.4), and section "4.2.4.17. Upgrade Kafka" tells you to:

cp /etc/kafka/conf.saved/tools-log4j.properties /etc/kafka/conf/
cp /etc/kafka/conf.saved/test-log4j.properties /etc/kafka/conf/
cp /etc/kafka/conf.saved/zookeeper.properties /etc/kafka/conf/
cp /etc/kafka/conf.saved/producer.properties /etc/kafka/conf/
cp /etc/kafka/conf.saved/consumer.properties /etc/kafka/conf/

These five config files were not present in my /etc/kafka/conf/ after the Express Upgrade to HDP 2.3.4.

So I started looking for them. I still had them available in /etc/kafka/2.3.2.0-2950/0/, but also in /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/. I tested both versions of the config files (the diff between them is minor).
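For anyone retracing this, the comparison and restore amounted to commands along these lines; first diff the saved 2.3.2 copy against the 2.3.4 stack default, then copy the missing file into the active config directory (the version-stamped paths are from our cluster and will differ on yours):

diff /etc/kafka/2.3.2.0-2950/0/tools-log4j.properties /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/tools-log4j.properties
cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/tools-log4j.properties /etc/kafka/conf/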

This resolved the tools-log4j.properties errors I was getting, but not the kafka_client_jaas.conf or the "No brokers found in ZK." errors.

The next step was to grab /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/kafka_client_jaas.conf as well, but this only changes the JAAS error:

$ ./bin/kafka-console-consumer.sh --zookeeper zoo1:2181,zoo2:2181,zoo3:2181 --max-messages 100 --topic perftest --from-beginning
[2016-01-21 14:59:27,797] WARN Could not login: the client is being asked for a password, but the Zookeeper client code does not currently support obtaining a password from the user. Make sure that the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)' and restart the client. If you still get this message after that, the TGT in the ticket cache has expired and must be manually refreshed. To do so, first determine if you are using a password or a keytab. If the former, run kinit in a Unix shell in the environment of the user who is running this Zookeeper client using the command 'kinit <princ>' (where <princ> is the name of the client's Kerberos principal). If the latter, do 'kinit -k -t <keytab> <princ>' (where <princ> is the name of the Kerberos principal, and <keytab> is the location of the keytab file). After manually refreshing your cache, restart this client. If you continue to see this message after manually refreshing your cache, ensure that your KDC host's clock is in sync with this host's clock. (org.apache.zookeeper.client.ZooKeeperSaslClient)
[2016-01-21 14:59:27,801] WARN SASL configuration failed: javax.security.auth.login.LoginException: No password provided Will continue connection to Zookeeper server without SASL authentication, if Zookeeper server allows it. (org.apache.zookeeper.ClientCnxn)
No brokers found in ZK.

So both the SASL error and the "No brokers found in ZK." error remain.
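One way to check the "No brokers found in ZK." part directly is to list the broker registrations in ZooKeeper, e.g. with the ZooKeeper CLI (the path below is the usual HDP client location, and /brokers/ids assumes the brokers use no custom chroot):

/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server zoo1:2181 ls /brokers/ids

An empty list means no broker has registered under that path, which would explain the error independently of the client-side config problems.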

That is as far as I've gotten. Has anyone else run into this problem after upgrading (or at some other point)?


10 REPLIES

Master Mentor

@Anders Synstad

Are you sure your upgrade completed successfully? Make sure you complete the upgrade on all nodes: go to the admin page and confirm that every node is done. Otherwise, follow the directions here. The final step of every upgrade is to run the hdp-select tool; Ambari does this for you in an automated upgrade, but in a manual upgrade you have to do it yourself. This tool symlinks the client directories (/usr/hdp/current) to the actual versioned directories (/usr/hdp/<version>/). You may not have done that. It's also good practice to run service checks after every install/upgrade.
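A sketch of how to verify that on a node (the version string below is this cluster's 2.3.4 build and will differ elsewhere):

hdp-select versions
hdp-select status | grep -v 2.3.4.0-3485

The first command lists the installed stack versions; the second should print nothing if every component already points at the new build.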

Explorer

This doesn't seem to be the case. All service checks return OK post install, and everything appears to be linked to the new version as it should be.

PS: the doc you linked is for Ambari 2.1.x and is outdated. HDP 2.3.4 comes with Ambari 2.2.0, which has a different upgrade guide without any of the manual steps described (though I assume most of the information still applies).

Master Mentor

@Anders Synstad I'm glad you ran the sanity checks; I only meant to highlight that you need to check the output of the hdp-select tool to make sure the upgrade is complete.

Master Guru

@Anders Synstad Something went wrong with your upgrade. Your Kafka is kerberized, right? You'll need both kafka_client_jaas.conf and kafka_server_jaas.conf; you can find details here. It's a good idea to open those files and sanity-check their contents. Once you have them right, and under /etc/kafka/conf, restart all brokers. Then create a new topic and make sure the console producer and consumer are working. For kerberized Kafka, after kinit, do this:

export CLIENT_JVMFLAGS="-Djava.security.auth.login.config=/etc/kafka/conf/kafka_client_jaas.conf"

and run the producer and consumer with the "--security-protocol SASL_PLAINTEXT" option. Details are in chapters 5 and 6 of that document. After that you can try the performance tests.
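Put together, a smoke test on a kerberized cluster might look roughly like this (the broker host, topic name, and principal are placeholders; 6667 is the usual HDP broker port):

kinit someuser@EXAMPLE.COM
export CLIENT_JVMFLAGS="-Djava.security.auth.login.config=/etc/kafka/conf/kafka_client_jaas.conf"
./bin/kafka-console-producer.sh --broker-list broker1:6667 --topic smoketest --security-protocol SASL_PLAINTEXT
./bin/kafka-console-consumer.sh --zookeeper zoo1:2181 --topic smoketest --from-beginning --security-protocol SASL_PLAINTEXT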

Explorer

This seems like a very plausible explanation. The upgrade, trying to be helpful, seems to have kerberized my Kafka install. I'll dig into it and see if I can un-kerberize it.

But I guess the kerberization of Kafka doesn't explain the other missing config files.

Master Guru

JAAS files are needed only for kerberized Kafka; that's why I thought yours was kerberized. Can you open the *.properties files and see what's there?

Explorer

After inspecting the configuration files more closely, my Kafka install does not actually appear to be kerberized.

It seems 'export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"' is added to some of the scripts (example: kafka-console-consumer.sh).

So the consumer and producer test scripts are kerberized, not Kafka itself (in HDP 2.3.2 as well). The --security-protocol PLAINTEXT option (the default, according to the help) does not help either.

Removing the export line and copying the missing configs seems to solve the problem.

Master Mentor

@Anders Synstad can you convert your comment to a reply? This is essential for future readers, and we can then mark this thread as closed.

Explorer

Either due to a bug in the HDP 2.3.4 Express Upgrade process, or something specific to our upgrade, some configuration files were not copied to /etc/kafka/conf/ as they seemingly should have been:

cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/tools-log4j.properties /etc/kafka/conf/
cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/test-log4j.properties /etc/kafka/conf/
cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/zookeeper.properties /etc/kafka/conf/
cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/producer.properties /etc/kafka/conf/
cp /usr/hdp/2.3.4.0-3485/etc/kafka/conf.default/consumer.properties /etc/kafka/conf/

This resolved the log errors when running the commands (and possibly some other issues), but the SASL/Kerberos issue remained.

# grep java.security.auth.login.config /usr/hdp/current/kafka-broker/bin/kafka-*
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-consumer-offset-checker.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-consumer-perf-test.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-producer-perf-test.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-replay-log-producer.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"
/usr/hdp/current/kafka-broker/bin/kafka-simple-consumer-shell.sh:export KAFKA_CLIENT_KERBEROS_PARAMS="-Djava.security.auth.login.config=/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf"

It turns out some of the command-line scripts enable JAAS/SASL/Kerberos. Using the "--security-protocol PLAINTEXT" option did not help either. The simple solution was to just comment out these export lines.
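For reference, commenting them out across all the affected scripts can be done in one pass; a sketch (it edits the packaged scripts in place, keeping a .bak copy of each, and note that a later stack upgrade may well restore the exports):

# prefix every KAFKA_CLIENT_KERBEROS_PARAMS export with '#', keeping a .bak backup of each script
for f in /usr/hdp/current/kafka-broker/bin/kafka-*.sh; do
    sed -i.bak 's/^export KAFKA_CLIENT_KERBEROS_PARAMS/#&/' "$f"
done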

After these minor fixes, everything was working as expected again.
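As a final check, re-running the same commands that failed at the start confirms it:

./bin/kafka-topics.sh --zookeeper zoo1:2181,zoo2:2181,zoo3:2181 --list
./bin/kafka-console-consumer.sh --zookeeper zoo1:2181,zoo2:2181,zoo3:2181 --max-messages 100 --topic perftest --from-beginning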