
Hands-on Spark Tutorial: Permission Denied and no such file or directory

Explorer

I was able to download and bring up the sandbox, and I changed the root user's password. I can also navigate to http://127.0.0.1:8088/ and see the Hadoop cluster. Now I am trying to work through the following tutorial:

http://hortonworks.com/blog/hands-on-tour-of-apach...

but I am running into permission issues or "no such file or directory" messages.

I was able to run this command and download data:

wget http://en.wikipedia.org/wiki/Hortonworks

but when I run the next step:

hadoop fs -put ~/Hortonworks /user/guest/Hortonworks

I get the following message:

(screenshot: 1041-horton1.jpg)

When I try to run hadoop fs -ls, I get a "No such file or directory" message:

(screenshot: 1042-horton2.jpg)

I then tried to create the /home/root directory but that failed too:

(screenshot: 1043-horton3.jpg)

Any thoughts on what may be happening here?

Thanks

1 ACCEPTED SOLUTION

Master Mentor

@shahab nasir

Do this: as the hdfs superuser, create your HDFS home directory, since relative paths (as in hadoop fs -ls) resolve to /user/root:

sudo su - hdfs

hdfs dfs -mkdir /user/root

hdfs dfs -chown root:hdfs /user/root

exit

and then run your hdfs commands as root.
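The steps above can be sketched as one script. This is a hedged sketch, not the exact accepted answer: it assumes you are root on the Hortonworks sandbox with the hdfs CLI on PATH, it uses non-interactive sudo -u hdfs in place of the interactive sudo su - hdfs, and the guard makes it a harmless no-op on machines without Hadoop.

```shell
# Guarded sketch: create root's HDFS home directory as the hdfs superuser.
if command -v hdfs >/dev/null 2>&1; then
    sudo -u hdfs hdfs dfs -mkdir -p /user/root         # hdfs is the HDFS superuser
    sudo -u hdfs hdfs dfs -chown root:hdfs /user/root  # hand the new dir to root
    hadoop fs -ls                                      # relative paths now resolve to /user/root
else
    echo "hdfs CLI not found; run this on the sandbox"
fi
```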


6 REPLIES


Explorer

Thanks Neeraj, that worked. Appreciate it.


Specifically for /user/guest/Hortonworks do this:

sudo su - hdfs
hadoop fs -chmod -R 777 /user/guest
exit
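The -R 777 mode above grants read, write, and execute to the owner, the group, and everyone else, which is why the put stops failing. A minimal sketch of what the octal digits mean, shown on a local POSIX filesystem since HDFS chmod uses the same notation (assumption: GNU stat is available, as on the sandbox's Linux; /tmp/perm_demo is a throwaway path invented for this demo):

```shell
# Local-filesystem sketch of the octal modes that hadoop fs -chmod uses.
mkdir -p /tmp/perm_demo
chmod 700 /tmp/perm_demo            # rwx for the owner only
stat -c '%a' /tmp/perm_demo         # prints 700
chmod -R 777 /tmp/perm_demo         # rwx for owner, group, and others
stat -c '%a' /tmp/perm_demo         # prints 777
```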


In the Sandbox, the /user/guest directory does not actually exist. You can create it with the following command:

hadoop fs -mkdir /user/guest/

Alternatively, you can use the /tmp directory:

hadoop fs -put ~/Hortonworks /tmp

The tutorial has been updated to use the /tmp directory to simplify the flow.


The tutorial needs to be fixed.


@shahab nasir

Best is to use the ambari-qa user. This is a special user with super powers 🙂

su ambari-qa

Overall, please take some time to understand the Hadoop security model and its user permissions; it works mostly like Unix. hdfs, yarn, etc. are service accounts that belong to the hadoop group.
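Because these service accounts are ordinary Unix users, you can inspect them with standard tools. A guarded sketch (assumption: the hdfs account only exists on Hadoop hosts such as the sandbox, so the fallback runs the same command against the current user):

```shell
# Inspect a service account's uid and group memberships, just like any Unix user.
if id hdfs >/dev/null 2>&1; then
    id hdfs      # on the sandbox, shows hdfs's uid and its hadoop group membership
else
    id           # no hdfs account here; show the current user instead
fi
```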

Spend some time on the Hadoop HDFS section; it will improve your understanding.