
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/links":hdfs:hdfs:drwxr-xr-x

Contributor

Hi, I get these errors when running any of the tutorial scripts for the Hortonworks sandbox HDP 2.3.2 on Linux:

[root@sandbox cloudapp-mp2]# bash start.sh
Coursera User ID: 15677561
User Id: 15677561
Dataset: 15
Dataset Patch: 4
User Id: 15677561
Prepare the Environment
/root/cloudapp-mp2
Prepare the HDFS
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/links":hdfs:hdfs:drwxr-xr-x
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/titles":hdfs:hdfs:drwxr-xr-x
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/misc":hdfs:hdfs:drwxr-xr-x
put: `/mp2/links/': No such file or directory
put: `/mp2/titles/': No such file or directory
put: `/mp2/misc/': No such file or directory
Done

Please advise. Thanks. Zeev Lazarev


7 REPLIES


I am not familiar with this Coursera course and Hadoop setup. What course is this?

You are getting a "permission denied" error because you are trying to write to a folder that is owned by the hdfs user, and its permissions do not allow write access for other users.
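You can verify this by listing the HDFS root; the owner and group columns should show hdfs, and "r-x" for others means no write access (the output below is just an illustration):

hdfs dfs -ls /
# drwxr-xr-x   - hdfs hdfs          0 2015-11-20 10:00 /mp2
# owner=hdfs, group=hdfs, others get only read/execute => root cannot write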

A) You could run your application/script as the hdfs user

su hdfs

or

export HADOOP_USER_NAME=hdfs
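Either way, the hdfs commands issued from that shell will run as the hdfs user, so the mkdir calls in the script should succeed. You can also set the variable for a single command, e.g.:

HADOOP_USER_NAME=hdfs hdfs dfs -mkdir -p /mp2/links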

B) Change the owner of the /mp2 folder (note: to change the owner you have to be a superuser or the current owner, i.e. hdfs)

hdfs dfs -chown -R <username_of_new_owner> /mp2
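For example (a sketch, assuming your script keeps running as root as in the log above), run the chown as the hdfs user and verify the new owner before rerunning start.sh:

su - hdfs -c "hdfs dfs -chown -R root /mp2"
hdfs dfs -ls /    # /mp2 should now list root as the owner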

Expert Contributor

@Jonas Straub

This seems to be a security risk in HDFS. Any user, even without sudo/su rights, can become the superuser simply by running: export HADOOP_USER_NAME=hdfs
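Indeed, on a cluster with simple authentication (no Kerberos), HDFS trusts whatever username the client reports, so any shell account can do something like this (the path is just a hypothetical example):

export HADOOP_USER_NAME=hdfs
hdfs dfs -rm -r /some/protected/dir    # runs with hdfs superuser rights

Enabling Kerberos closes this hole, since the identity is then verified by the KDC rather than taken from the client environment.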

New Contributor

Thanks Jonas. Your suggestion of exporting this env variable also works when you are trying to connect and work with a remote cluster, while retaining all the configurations of a local development environment.
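For example (the NameNode hostname and port below are placeholders for your remote cluster):

export HADOOP_USER_NAME=hdfs
hdfs dfs -ls hdfs://namenode.example.com:8020/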

Master Mentor
@Zeev Lazarev

The root user does not have access to create directories under / in HDFS.

You can copy and paste this into your SSH window:

su - hdfs

hdfs dfs -mkdir -p /mp2/links

hdfs dfs -chown -R root:hdfs /mp2/links

exit
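Since the start.sh log above also shows failures for /mp2/titles and /mp2/misc, you will likely need all three directories; the same pattern, sketched out:

su - hdfs
hdfs dfs -mkdir -p /mp2/links /mp2/titles /mp2/misc
hdfs dfs -chown -R root:hdfs /mp2
exit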

Contributor

@Jonas Straub @Neeraj Sabharwal

Thank you, guys. Since the /mp2/links, /mp2/titles, and /mp2/misc folders are dynamically created and removed, I had better run the script as the hdfs user, i.e. su - hdfs. I will try this out. Thanks. Zeev Lazarev
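Alternatively (a sketch, assuming the script only needs HDFS-side permissions, since the hdfs user may not be able to read files under /root), I could keep the root shell and just switch the HDFS identity before running it:

export HADOOP_USER_NAME=hdfs
bash start.sh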

Contributor

@Jonas Straub @Neeraj Sabharwal

Using the hdfs user instead of root did work. Much appreciated. Zeev


Awesome, good to hear. Good Luck with your Coursera course 🙂