LoginException: Algorithm HmacMD5 not available when logging in to HDFS
Labels: HDFS
Created ‎08-22-2016 12:29 AM
Hi,
We are working with a secured CDH 5.7 cluster. I have a Java program that accesses HDFS.
The Java code:
conf.addResource(path1);
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromKeytab("arch_onedata@OD.BETA", filePath);
fs = FileSystem.get(conf);
When I run the main class in Eclipse, it works normally.
When I run the same main class through a shell script, the exception below occurs:
15:17:02,718 DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
15:17:02,736 DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
15:17:02,743 DEBUG org.apache.hadoop.metrics2.lib.MutableMetricsFactory:42 - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, valueName=Time, value=[GetGroups])
15:17:02,746 DEBUG org.apache.hadoop.metrics2.impl.MetricsSystemImpl:231 - UgiMetrics, User and group related metrics
15:17:03,244 DEBUG org.apache.hadoop.security.SecurityUtil:110 - Setting hadoop.security.token.service.use_ip to true
15:17:03,358 DEBUG org.apache.hadoop.security.Groups:301 - Creating new Groups object
15:17:03,546 DEBUG org.apache.hadoop.util.Shell:419 - setsid exited with exit code 0
15:17:03,601 DEBUG org.apache.hadoop.security.Groups:112 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
15:17:03,726 DEBUG org.apache.hadoop.security.UserGroupInformation:221 - hadoop login
java.io.IOException: Login failure for arch_onedata@OD.BETA from keytab ../conf/arch_onedata.keytab: javax.security.auth.login.LoginException: Algorithm HmacMD5 not available
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:962)
What happened? How can I resolve this issue?
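For reference, a minimal self-contained version of the login code looks roughly like this (the configuration and keytab paths are placeholders for the values used in my environment):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsKerberosLogin {
    public static void main(String[] args) throws Exception {
        // Load the cluster configuration copied from the cluster (placeholder paths)
        Configuration conf = new Configuration();
        conf.addResource(new Path("../conf/core-site.xml"));
        conf.addResource(new Path("../conf/hdfs-site.xml"));

        // Log in with the Kerberos principal and keytab before touching HDFS
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("arch_onedata@OD.BETA",
                "../conf/arch_onedata.keytab");

        // Obtain a FileSystem handle as the logged-in user
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Home directory: " + fs.getHomeDirectory());
    }
}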
BR
Paul
Created ‎08-22-2016 07:30 AM
Which Java version is in use when you run via the shell script on the cluster, vs. when you're using your developer machine's IDE?
Some older Linux distributions ship a GNU Java (gcj) install that is very
outdated and may not carry this specific algorithm, among others; if your
app runs under such a version, this error can appear.
If you are using 'hadoop jar' to run the application, try exporting an
explicit JAVA_HOME that points to an Oracle JDK7 or JDK8 install path
before running the command to have it pick up the intended JVM.
Alternatively, you may be missing the right crypto extensions on your JDK.
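As a quick check, a small diagnostic along these lines (the class name is just illustrative) prints which JVM is actually picked up and whether HmacMD5 can be obtained:
import javax.crypto.Mac;

public class CheckHmacMd5 {
    public static void main(String[] args) throws Exception {
        // Show which JVM and vendor the launching environment actually resolved
        System.out.println("java.home    = " + System.getProperty("java.home"));
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        System.out.println("java.version = " + System.getProperty("java.version"));

        // Throws NoSuchAlgorithmException if no installed provider offers HmacMD5
        Mac mac = Mac.getInstance("HmacMD5");
        System.out.println("HmacMD5 from provider: " + mac.getProvider().getName());
    }
}
Running it from the same shell script that launches your application should show whether the script resolves a different JVM than Eclipse does.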
Created ‎08-22-2016 11:24 PM
Hi Harsh,
The issue is gone after I packaged the sunjce_provider.jar from the JRE into the lib folder.
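For anyone hitting the same problem, a quick way to confirm the SunJCE provider is actually visible at runtime is a diagnostic sketch along these lines (the class name is just illustrative, not part of the application):
import java.security.Provider;
import java.security.Security;

public class ListJceProviders {
    public static void main(String[] args) {
        // List every registered security provider; HmacMD5 normally comes from SunJCE
        for (Provider p : Security.getProviders()) {
            System.out.println(p.getName() + " " + p.getVersion());
        }
        System.out.println("SunJCE registered: " + (Security.getProvider("SunJCE") != null));
    }
}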
Thanks
BR
Paul
