Created 05-21-2017 11:34 PM
I'm trying to access HDFS on behalf of another user, using the following application:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

import java.security.PrivilegedExceptionAction;

public class HDFSProxyTest {

    public static void main(String[] args) throws Exception {
        String hadoopConfigurationPath = "/etc/hadoop/conf/";
        final Configuration hdfsConfiguration = new Configuration();

        // Load the cluster configuration explicitly.
        Path coreSitePath = new Path(hadoopConfigurationPath + "core-site.xml");
        hdfsConfiguration.addResource(coreSitePath);
        Path hdfsSitePath = new Path(hadoopConfigurationPath + "hdfs-site.xml");
        hdfsConfiguration.addResource(hdfsSitePath);

        // Authenticate as striim1 from its keytab.
        UserGroupInformation.setConfiguration(hdfsConfiguration);
        UserGroupInformation.loginUserFromKeytab("striim1@FCE.CLOUDERA.COM",
                "/home/striim/striim1_client.keytab");

        // Create a proxy-user UGI for "joy", backed by striim1's credentials.
        UserGroupInformation ugi =
                UserGroupInformation.createProxyUser("joy", UserGroupInformation.getLoginUser());

        // Open the FileSystem as the proxy user.
        FileSystem hadoopFileSystem = ugi.doAs(new PrivilegedExceptionAction<FileSystem>() {
            public FileSystem run() throws Exception {
                return FileSystem.get(hdfsConfiguration);
            }
        });

        FSDataOutputStream fsDataOutputStream =
                hadoopFileSystem.create(new Path("/user/striim1/hdfsproxy.csv"));
        fsDataOutputStream.write("This is niranjan!!! testing this\n".getBytes());
        fsDataOutputStream.close();
        hadoopFileSystem.close();
    }
}
Here, the OS user executing the application is striim; striim1 is the superuser with Kerberos credentials that I log in as; and joy is the user on whose behalf I'm trying to access HDFS.
I end up with this exception:
2017-05-19 02:45:34,843 - WARN main org.apache.hadoop.util.NativeCodeLoader.<clinit> (NativeCodeLoader.java:62) Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=joy, access=WRITE, inode="/user/striim1":striim1:striim1:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
    at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:178)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3560)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3543)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3525)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6592)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2821)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2624)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:599)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.create(AuthorizationProviderProxyClientProtocol.java:112)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:401)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2141)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1714)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2135)
This is my proxy-user configuration in core-site.xml:
<property>
  <name>hadoop.proxyuser.striim1.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.striim1.groups</name>
  <value>*</value>
</property>
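One related assumption on my part (I haven't verified this on my cluster): since these proxy-user properties live in core-site.xml, I believe the NameNode has to be restarted, or told to reload them after a change, with something like:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration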
This is the permission setting of the folder I'm trying to access:
drwxr-xr-x - striim1 striim1 0 2017-05-19 02:50 /user/striim1
Though I understand the exception, it leads me to the following questions:
1) Even though I pass the superuser's UGI to the proxy user joy, why is the client trying to create the file in the context of user joy?
2) In my cluster deployment, "striim1" is just a user who has Kerberos credentials, not really a superuser as per this definition. Would impersonation work only if "striim1" is a superuser or is added to the superuser's group?
3) Should the user I'm trying to impersonate be a valid OS user? If not, what would happen, and what validation is done in this respect?
4) What should the permission setting be on the directory I'm trying to write to as this impersonated user? Should it be a location that is owned by the superuser or by the superuser's group?
5) Should UGI.createProxyUser be called explicitly in my application? Say I execute my application as the user I want to impersonate, log in with the superuser's keytab, and pass the proxy-user configuration (basically core-site.xml) to my application; would this suffice? (I'm expecting something like createProxyUser being called internally, taking the current application user as the user to be impersonated; see the sketch after these questions.)
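To make question 5 concrete, here is a sketch of the variant I have in mind (hypothetical and untested; HDFSImplicitProxyTest is just an illustrative name). The application itself would run as OS user "joy", and I would never call createProxyUser:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical variant for question 5: the process runs as OS user "joy",
// I log in with the superuser's keytab, and I rely on the proxy-user
// configuration alone -- no explicit createProxyUser/doAs.
public class HDFSImplicitProxyTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("striim1@FCE.CLOUDERA.COM",
                "/home/striim/striim1_client.keytab");

        // No createProxyUser here -- would HDFS see "joy" or "striim1"?
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Connected as: " + UserGroupInformation.getCurrentUser());
        fs.close();
    }
}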
It would be really helpful if I could get answers to these five questions.
Thanks in advance.
Regards,
Niranjan
Created 05-24-2017 02:14 PM
Created on 05-24-2017 09:37 PM - edited 05-24-2017 09:37 PM
Hi Naveen,
I understand that user "joy" doesn't have sufficient permission to write to or access that folder, but isn't that exactly what impersonation is for? In this case, as you can see from my code, I'm trying to write on behalf of "joy" using the user "striim1", who has sufficient privileges. If I'm doing anything wrong here, please let me know.
I don't understand how configuring ACLs is related to achieving user impersonation. If this is a prerequisite of sorts, please let me know, because I don't find it mentioned in any of the HDFS user impersonation docs.
Regards,
Niranjan
Created 05-30-2017 01:41 AM
Hi Naveen,
I realized my understanding was wrong: the system sees "joy" as the user who is trying to write, and permissions are enforced against "joy". So I set up an ACL for joy (roughly as shown below) and my program worked fine.
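For anyone who lands here later, the ACL I added was along these lines (assuming ACLs are enabled on the NameNode via dfs.namenode.acls.enabled=true; adjust the path and permissions to your case):

hdfs dfs -setfacl -m user:joy:rwx /user/striim1
hdfs dfs -getfacl /user/striim1    # verify the new entry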
Now I understand the role ACLs play with respect to impersonation. Thanks for the pointer.
Regards,
Niranjan