
Permission denied: user=admin

Expert Contributor

I installed my HDP cluster without any errors: the installation reported 100% success, and Ambari shows all services started.

But I noticed that under `/user` there is no directory for admin. (I am logged into Ambari as admin/admin.)

When I try to upload a file to HDFS, I get this error:

```
Permission denied: user=admin, access=WRITE, inode="/foo.py":hdfs:hdfs:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:325)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:246)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1950)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1934)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1917)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2767)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2702)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2586)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:736)
```
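For context, the check that `FSPermissionChecker` is throwing from is ordinary POSIX-style permission evaluation: the target directory is owned by `hdfs:hdfs` with mode `drwxr-xr-x` (755), so the `admin` user falls into the "other" class, which has no write bit. A minimal illustrative sketch of that logic (not the actual Hadoop code):

```python
# Illustrative sketch of a POSIX-style permission check, as HDFS performs it.
# mode is an octal int like 0o755; want is 'r', 'w', or 'x'.
def check_access(user, groups, owner, group, mode, want):
    bit = {"r": 4, "w": 2, "x": 1}[want]
    if user == owner:
        perm = (mode >> 6) & 7   # owner bits
    elif group in groups:
        perm = (mode >> 3) & 7   # group bits
    else:
        perm = mode & 7          # other bits
    return bool(perm & bit)

# /user is hdfs:hdfs with mode 755, so "admin" hits the "other" bits
# and has no WRITE permission, while "hdfs" itself does:
print(check_access("admin", {"admin"}, "hdfs", "hdfs", 0o755, "w"))  # False
print(check_access("hdfs", {"hdfs"}, "hdfs", "hdfs", 0o755, "w"))    # True
```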

I googled and found this thread:

https://community.hortonworks.com/questions/97703/hive-write-permission-denied.html

But that solution does not work for me, because there is no directory called `/user/admin`.

Edit: I solved the problem with these two commands:

```
sudo -u hdfs hadoop fs -mkdir /user/admin
sudo -u hdfs hadoop fs -chown -R admin:admin /user/admin
```

After this I can work in these directories, but I am curious to know why, out of the box, the admin user has no permissions in my cluster. (In the Hortonworks sandbox, the admin user can work on all directories.)

1 ACCEPTED SOLUTION

Master Mentor

@Knows NotMuch

A common requirement when initializing user accounts to run Hadoop components is the existence of a unique `/user/<username>` HDFS home directory. You can enable automated creation of a `/user/<username>` HDFS home directory for each user that you create.

Home directory creation occurs for users created either manually using the Ambari Admin page, or through LDAP synchronization.

See the link below to learn more about the new ambari.properties property and the "post-user-creation-hook.sh" script.

Property:

```
ambari.post.user.creation.hook.enabled=true
```

Script:

```
ambari.post.user.creation.hook=/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh
```

https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-administration/content/create_use...
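To make the change concrete, here is a hedged sketch of the two entries as they would appear in ambari.properties. On a real server the file typically lives at `/etc/ambari-server/conf/ambari.properties`; the sketch below writes a local sample file purely to show the settings, and the restart command is shown as a comment:

```shell
# Sketch only: write a local sample file showing the two hook properties.
# On a real server, add these lines to /etc/ambari-server/conf/ambari.properties instead.
cat > ambari.properties.sample <<'EOF'
ambari.post.user.creation.hook.enabled=true
ambari.post.user.creation.hook=/var/lib/ambari-server/resources/scripts/post-user-creation-hook.sh
EOF

# After editing the real file, restart Ambari so the properties take effect:
#   ambari-server restart

# Both hook properties should now be present in the sample file:
grep -c '^ambari.post.user.creation.hook' ambari.properties.sample
```

Note that the hook only runs for users created after it is enabled; for existing users (like `admin` here), the home directory still has to be created manually.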


2 REPLIES


Master Mentor

@Knows NotMuch

In the Sandbox it works because the Sandbox ships a pre-configured cluster with pre-configured users and their home directories.