
Unable to overcome this exception (HiveAccessControlException Permission denied: user [admin] does not have [READ])

Contributor

I am trying to run this sample: http://hortonworks.com/hadoop-tutorial/how-to-refine-and-visualize-server-log-data/

sudo -u hdfs hadoop fs -chmod -R 777 /flume

It fails with: /flume - no such file or directory.

(screenshot attached: 1587-virtualbox-hortonworks-sandbox-with-hdp-232-27-01.png)

1 ACCEPTED SOLUTION

Master Mentor

Hi @Sai ram

Your question title and description do not match.

I have looked into the article, and it looks like /flume does not exist.

If the query doesn't run successfully due to a permissions error, you might need to update the permissions on the directory. Run the following commands over SSH on the Sandbox:

sudo -u hdfs hadoop fs -mkdir /flume
sudo -u hdfs hadoop fs -chmod -R 777 /flume
sudo -u hdfs hadoop fs -chown -R admin /flume
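
As a quick check afterwards (a minimal sketch, nothing Sandbox-specific beyond the /flume path from the tutorial), the listing should show the directory owned by admin with open permissions:

# verify /flume now exists with the expected owner and permissions
sudo -u hdfs hadoop fs -ls / | grep flume
# the entry should look roughly like: drwxrwxrwx   - admin ...   /flume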


7 REPLIES

Master Mentor

@Sai ram

sudo -u hdfs hdfs -mkdir /flume

sudo -u hdfs hdfs -chmod -R 777 /flume


Master Mentor

@Sai ram

Sorry

sudo -u hdfs hdfs dfs -mkdir /flume

sudo -u hdfs hdfs dfs -chmod -R 777 /flume

It's too early 🙂
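
For what it's worth, hadoop fs and hdfs dfs are interchangeable front ends for HDFS paths, so either form of these commands should behave the same, e.g.:

# both commands list the same HDFS directory
sudo -u hdfs hadoop fs -ls /flume
sudo -u hdfs hdfs dfs -ls /flume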


Contributor

Hi @Neeraj Sabharwal,

I still get this error:

(screenshot attached: 1589-capture.png)

Master Mentor

@Sai ram

Log in as root and run this:

sudo -u hdfs hadoop fs -chmod -R 777 /flume
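
For completeness, a rough sketch of that (the 127.0.0.1 address and port 2222 are the usual Sandbox SSH defaults and are an assumption here; adjust for your own VM setup):

# SSH into the Sandbox as root
ssh root@127.0.0.1 -p 2222

# then run the chmod as the hdfs superuser
sudo -u hdfs hadoop fs -chmod -R 777 /flume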

Guru

Hi @Sai ram,

It looks like you are using Ranger and you do not have a Ranger HDFS policy that allows the user hive to write to "/flume".

On the one hand, the solution from @Neeraj Sabharwal grants permissions at the HDFS level and solves your problem; on the other hand, if you want to go with Ranger, I'd recommend creating/adjusting Ranger HDFS policies for the relevant folders/users (and doing at least a chmod 700 at the HDFS level itself, to prevent accessing folders/files "by accident").
On the one hand, the solution from @Neeraj Sabharwal is granting permissions on HDFS level and solves your problem, on the other hand, if you want to go with Ranger I'd recommend to create/adjust Ranger-HDFS-policies for certain folders/users (and do a, at least, chmod 700 on HDFS level itself to prevent accessing folders/files "by accident")