Created on 01-19-2016 05:36 AM - edited 09-16-2022 02:58 AM
Hi, I get these errors when running any of the tutorial scripts for the Hortonworks HDP 2.3.2 sandbox on Linux:
[root@sandbox cloudapp-mp2]# bash start.sh
Coursera User ID: 15677561
User Id: 15677561
Dataset: 15
Dataset Patch: 4
User Id: 15677561
Prepare the Environment
/root/cloudapp-mp2
Prepare the HDFS
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/links":hdfs:hdfs:drwxr-xr-x
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/titles":hdfs:hdfs:drwxr-xr-x
mkdir: Permission denied: user=root, access=WRITE, inode="/mp2/misc":hdfs:hdfs:drwxr-xr-x
put: `/mp2/links/': No such file or directory
put: `/mp2/titles/': No such file or directory
put: `/mp2/misc/': No such file or directory
Done
Please advise. Thanks. Zeev Lazarev
Created 01-19-2016 06:15 AM
I am not familiar with this Coursera course and Hadoop setup. What course is this?
You are getting a "Permission denied" error because you are trying to write to a folder that is owned by the hdfs user, and its permissions (drwxr-xr-x) do not allow write access for other users.
A) You could run your application/script as the hdfs user:
su hdfs
or
export HADOOP_USER_NAME=hdfs
B) Change the owner of the /mp2 folder (note: to change the owner you have to be a superuser or the current owner, which here is hdfs):
hdfs dfs -chown -R <username_of_new_owner> /mp2
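To see which situation you are in, you can check the current owner and permissions first (the /mp2 paths below come straight from your error output):

hdfs dfs -ls /       # the /mp2 entry should show hdfs:hdfs and drwxr-xr-x
hdfs dfs -ls /mp2    # same check for /mp2/links, /mp2/titles, /mp2/misc

Running the same -ls again after the chown lets you confirm the new owner took effect.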
Created 05-06-2016 06:06 PM
This seems to be a security risk in HDFS. Any user, without having sudo/su rights, can become the superuser simply by running: export HADOOP_USER_NAME=hdfs
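To illustrate the point (the path below is just a hypothetical example, and this assumes a cluster with simple authentication, i.e. no Kerberos):

export HADOOP_USER_NAME=hdfs
hdfs dfs -mkdir /restricted-example    # hypothetical path; succeeds because HDFS now treats you as the superuser

This is exactly why production clusters enable Kerberos: there the identity comes from a verified ticket, and HADOOP_USER_NAME is not honored.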
Created 10-05-2016 02:06 PM
Thanks Jonas. Your suggestion of exporting this env variable also works when you are trying to connect and work with a remote cluster, while retaining all the configurations of a local development environment.
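A minimal sketch of that remote use (the namenode host and port below are placeholders for your own cluster):

export HADOOP_USER_NAME=hdfs
hdfs dfs -ls hdfs://remote-namenode:8020/mp2    # fully qualified URI, runs against the remote cluster as hdfs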
Created 01-19-2016 11:10 AM
The root user does not have permission to create directories under / in HDFS.
You can copy and paste this into your SSH window:
su - hdfs
hdfs dfs -mkdir -p /mp2/links
hdfs dfs -chown -R root:hdfs /mp2/links
exit
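Since the error output also mentions /mp2/titles and /mp2/misc, you would presumably repeat this for all three paths, e.g.:

su - hdfs
hdfs dfs -mkdir -p /mp2/links /mp2/titles /mp2/misc
hdfs dfs -chown -R root:hdfs /mp2
exit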
Created 01-19-2016 07:55 PM
@Jonas Straub @Neeraj Sabharwal
Thank you, guys. Since the /mp2/links, /mp2/titles, and /mp2/misc folders are dynamically created and removed, I had better run the script as the hdfs user, i.e. su - hdfs. Will try this out. Thanks. Zeev Lazarev
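One caveat I noticed: the tutorial files live under /root/cloudapp-mp2, which the hdfs user may not be able to read, so the environment-variable route from option A might be simpler in my case:

export HADOOP_USER_NAME=hdfs
bash start.sh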
Created 01-19-2016 08:11 PM
@Jonas Straub @Neeraj Sabharwal
Using the hdfs user instead of root did work. Appreciated. Zeev
Created 01-20-2016 05:29 AM
Awesome, good to hear. Good Luck with your Coursera course 🙂