
DIGEST-MD5 Error Message

Contributor

I got the below error message in the logs on two datanodes. A hard restart fixed the problem, but I wanted to know what prompted that error.

If someone can throw some light on it, I will greatly appreciate it.


DataXceiver error processing unknown operation  src: /10.13.162.216:56080 dst: /10.13.161.101:1204
javax.security.sasl.SaslException: DIGEST-MD5: IO error acquiring password [Caused by org.apache.hadoop.hdfs.protocol.datatransfer.InvalidEncryptionKeyException: Can't re-compute encryption key for nonce, since the required block key (keyID=1900417437) doesn't exist.
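
Since the exception mentions a missing block key, here are the related data transfer encryption settings from our hdfs-site.xml for context (a sketch; these are the stock HDFS property names, and the interval values shown are the defaults, in minutes):

<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<!-- the NameNode rolls block keys every dfs.block.access.key.update.interval
     minutes, and old keys age out after dfs.block.access.token.lifetime minutes -->
<property>
  <name>dfs.block.access.key.update.interval</name>
  <value>600</value>
</property>
<property>
  <name>dfs.block.access.token.lifetime</name>
  <value>600</value>
</property>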

3 REPLIES

Explorer

Hi Abdul,


Have you managed to resolve this issue? I am getting this error now, but the config looks fine.

The HMS delegation token store is set to org.apache.hadoop.hive.thrift.DBTokenStore, but I am still getting this error.
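
For reference, this is how it is set in our hive-site.xml (a minimal excerpt; the property name is the standard one from upstream Hive):

<!-- hive-site.xml -->
<property>
  <name>hive.cluster.delegation.token.store.class</name>
  <value>org.apache.hadoop.hive.thrift.DBTokenStore</value>
</property>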

Care to share the solution, if any?

Guru

Hi @rar59b,


We are sorry to hear you are having trouble. Can you please open a new thread and provide some background (environment, the product you are dealing with, etc.) on what you are trying to do and what is happening? The original thread is about datanodes, while your question seems to refer to HMS.


Thanks!

Li

Li Wang, Technical Solution Manager



Rising Star

Hi @Abdul @lwang @rar59b 

We have a known issue with javax.security.sasl.SaslException: DIGEST-MD5: IO error acquiring password, which is not yet fixed in any current CDP version:

https://issues.apache.org/jira/browse/HDFS-16332

Are you seeing any job failures when this issue occurs?

What values have you set for the parameters below?

dfs.data.transfer.protection =

hadoop.rpc.protection =

dfs.encrypt.data.transfer =
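
For comparison, on a typical secure cluster those settings would look something like this (example values only, not a recommendation; the two protection properties each accept authentication, integrity, or privacy):

<!-- hdfs-site.xml : example values only, check which combination your cluster actually uses -->
<property>
  <name>dfs.data.transfer.protection</name>
  <value>privacy</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>

<!-- core-site.xml -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>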