Connect to secure hadoop cluster from non-cluster host

Contributor

Hi,

 

Is it possible to access a secure cluster from a host that is not part of the cluster, i.e. acting as a service (HDFS/YARN/etc.) gateway?

 

I've downloaded the client configuration from the cluster and configured krb5.conf. kinit succeeds, but I am still unable to connect to HDFS.
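
For completeness, here is roughly what I run on the remote host (the yarn-conf directory is the client configuration downloaded from the cluster; the krb5.conf path is just an example of where I keep my copy):

$ export HADOOP_CONF_DIR=/home/user01/yarn-conf   # downloaded client configs
$ export KRB5_CONFIG=/home/user01/krb5.conf       # example path to my copy of the cluster's krb5.conf
$ kinit user01@DEVELOPMENT.COM
$ hadoop fs -ls /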

 

$ klist
Ticket cache: FILE:/tmp/krb5cc_501
Default principal: user01@DEVELOPMENT.COM

Valid starting     Expires            Service principal
12/22/15 14:57:07  12/23/15 00:57:11  krbtgt/DEVELOPMENT.COM@DEVELOPMENT.COM
        renew until 12/29/15 14:57:07

$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Djavax.net.debug=ssl"
$ hadoop fs -ls /
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_501
>>>DEBUG <CCacheInputStream>  client principal is user01@DEVELOPMENT.COM
>>>DEBUG <CCacheInputStream> server principal is krbtgt/DEVELOPMENT.COM@DEVELOPMENT.COM
>>>DEBUG <CCacheInputStream> key type: 23
>>>DEBUG <CCacheInputStream> auth time: Tue Dec 22 14:57:11 WIB 2015
>>>DEBUG <CCacheInputStream> start time: Tue Dec 22 14:57:07 WIB 2015
>>>DEBUG <CCacheInputStream> end time: Wed Dec 23 00:57:11 WIB 2015
>>>DEBUG <CCacheInputStream> renew_till time: Tue Dec 29 14:57:07 WIB 2015
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL; PRE_AUTH;
ls: failure to login
1 ACCEPTED SOLUTION

Contributor

Hi Harsh,


@Harsh J wrote:
Could you re-run the command also with the below env set?

$ export HADOOP_ROOT_LOGGER=TRACE,console
$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Djavax.net.debug=ssl"
$ hadoop fs -ls /


Here is the result:

16/01/04 17:42:07 DEBUG util.Shell: setsid exited with exit code 0
16/01/04 17:42:07 DEBUG conf.Configuration: parsing URL jar:file:/app/hadoop-2.6.0-cdh5.4.5/share/hadoop/common/hadoop-common-2.6.0-cdh5.4.5.jar!/core-default.xml
16/01/04 17:42:07 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@4ae69619
16/01/04 17:42:07 DEBUG conf.Configuration: parsing URL file:/home/user01/yarn-conf/core-site.xml
16/01/04 17:42:07 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@30317bdd
16/01/04 17:42:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
16/01/04 17:42:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
16/01/04 17:42:08 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[GetGroups], about=, type=DEFAULT, always=false, sampleName=Ops)
16/01/04 17:42:08 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
16/01/04 17:42:08 DEBUG security.Groups:  Creating new Groups object
16/01/04 17:42:08 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
>>>KinitOptions cache name is /tmp/krb5cc_501
>>>DEBUG <CCacheInputStream>  client principal is user01@DEVELOPMENT.COM
>>>DEBUG <CCacheInputStream> server principal is krbtgt/DEVELOPMENT.COM@DEVELOPMENT.COM
>>>DEBUG <CCacheInputStream> key type: 23
>>>DEBUG <CCacheInputStream> auth time: Mon Jan 04 17:41:23 WIB 2016
>>>DEBUG <CCacheInputStream> start time: Mon Jan 04 17:41:06 WIB 2016
>>>DEBUG <CCacheInputStream> end time: Tue Jan 05 03:41:23 WIB 2016
>>>DEBUG <CCacheInputStream> renew_till time: Mon Jan 11 17:41:06 WIB 2016
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL; PRE_AUTH;
16/01/04 17:42:08 DEBUG security.UserGroupInformation: hadoop login
16/01/04 17:42:08 DEBUG security.UserGroupInformation: hadoop login commit
16/01/04 17:42:08 DEBUG security.UserGroupInformation: using kerberos user:user01@DEVELOPMENT.COM
16/01/04 17:42:08 DEBUG security.UserGroupInformation: Using user: "user01@DEVELOPMENT.COM" with name user01@DEVELOPMENT.COM
16/01/04 17:42:08 DEBUG security.UserGroupInformation: failure to login
javax.security.auth.login.LoginException: java.lang.IllegalArgumentException: Illegal principal name user01@DEVELOPMENT.COM: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to user01@DEVELOPMENT.COM
        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:199)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:762)
        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:203)
        at javax.security.auth.login.LoginContext$4.run(LoginContext.java:690)
        at javax.security.auth.login.LoginContext$4.run(LoginContext.java:688)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:687)
        at javax.security.auth.login.LoginContext.login(LoginContext.java:596)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:812)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2753)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2745)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2611)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
        at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
        at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:224)
        at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:207)
        at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:100)
        at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
Caused by: java.lang.IllegalArgumentException: Illegal principal name user01@DEVELOPMENT.COM: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to user01@DEVELOPMENT.COM
        at org.apache.hadoop.security.User.<init>(User.java:50)
        at org.apache.hadoop.security.User.<init>(User.java:43)
        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:197)
        ... 30 more
Caused by: org.apache.hadoop.security.authentication.util.KerberosName$NoMatchingRule: No rules applied to user01@DEVELOPMENT.COM
        at org.apache.hadoop.security.authentication.util.KerberosName.getShortName(KerberosName.java:389)
        at org.apache.hadoop.security.User.<init>(User.java:48)
        ... 32 more
ls: failure to login

The logs above show that the Kerberos client configuration still points to the default /etc/krb5.conf. I had been using a different path by exporting the KRB5_CONFIG environment variable.
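
As far as I can tell, the JVM's Kerberos code does not honor the KRB5_CONFIG environment variable at all: it only looks at the java.security.krb5.conf system property, then the JRE's lib/security/krb5.conf, and finally /etc/krb5.conf. So an alternative to editing /etc/krb5.conf should be to point the JVM at the file explicitly, for example (the path here is just an example):

$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Djava.security.krb5.conf=/home/user01/krb5.conf"
$ hadoop fs -ls /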

 

After editing /etc/krb5.conf with the proper values, it now works. I can browse HDFS and submit jobs to YARN.
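
For anyone hitting the same "No rules applied" error: it appears to come from Hadoop's principal-to-short-name mapping, which by default only strips the realm when it matches the default_realm from the krb5.conf that the JVM actually loaded, so a wrong or missing default_realm on the client host triggers it. My working /etc/krb5.conf looks roughly like this (the KDC hostname below is a placeholder, not my real one):

[libdefaults]
  default_realm = DEVELOPMENT.COM

[realms]
# kdc and admin_server below are placeholders; use the cluster's real KDC host
  DEVELOPMENT.COM = {
    kdc = kdc.development.example
    admin_server = kdc.development.example
  }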

 


@Harsh J wrote:

Is this remote host also carrying the Unlimited JCE policy jars under its JDK, so it may use AES-256 if that is in use?

I use the JDK from Cloudera: jdk1.7.0_67-cloudera.

 

Thank you very much, Harsh.


2 REPLIES

Mentor
Could you re-run the command also with the below env set?

$ export HADOOP_ROOT_LOGGER=TRACE,console
$ export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Djavax.net.debug=ssl"
$ hadoop fs -ls /

Is this remote host also carrying the Unlimited JCE policy jars under its JDK, so it may use AES-256 if that is in use?
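
For example, something along these lines should print a value larger than 128 (typically 2147483647) if the unlimited-strength policy files are in place:

$ jrunscript -e 'print (javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'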
