Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant

External Table creation error/ permission denied

Expert Contributor

Hi,

I am getting a permission error while creating an external table. I am logged in as root.

hive> create external table users (
    >   user_id INT,
    >   age INT,
    >   gender STRING,
    >   occupation STRING,
    >   zip_code STRING
    > )
    > ROW FORMAT DELIMITED
    > FIELDS TERMINATED BY '|'
    > STORED AS TEXTFILE
    > LOCATION '/myid/userinfo';

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x
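Before retrying the DDL, the ownership of the target location can be inspected from the shell. A quick sketch, assuming the /myid/userinfo path from the LOCATION clause above (adjust for your cluster):

```shell
# Show ownership and permissions of the target directory itself
# (path taken from the DDL's LOCATION clause; hypothetical for your setup).
hdfs dfs -ls -d /myid/userinfo

# If the directory does not exist yet, create it as the hdfs superuser
# and hand ownership to the user who will run Hive (root in this thread):
sudo -u hdfs hdfs dfs -mkdir -p /myid/userinfo
sudo -u hdfs hdfs dfs -chown root /myid/userinfo
```

If the directory is owned by the user running Hive, the AccessControlException above should no longer be raised for that location.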

1 ACCEPTED SOLUTION

Master Guru

@Tech Guy Please take a look at this post.

You are getting a "Permission denied" error because you are trying to access a folder as the "root" user while it is owned by the "hdfs" user, and its permissions do not allow write access for others. One option is to change the owner of / to root:

sudo su - hdfs

then

hdfs dfs -chown -R root /

Or, if you don't want to change the ownership, launch Hive as the hdfs user:

sudo su - hdfs

then run hive. No more permission issues.
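A narrower variant of the same chown approach (a sketch, not from the original post) is to change ownership of only the table's location rather than all of /, which avoids touching directories that other services depend on:

```shell
# Assumes the /myid/userinfo location from the question's DDL;
# switch to the HDFS superuser, chown just that subtree, then exit.
sudo su - hdfs
hdfs dfs -chown -R root /myid/userinfo
exit
```

Recursively chowning the HDFS root to root can break other components that expect hdfs-owned system directories, so scoping the change to the table's directory is usually safer.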


3 REPLIES

Super Guru

@Tech Guy

One option is to first sudo as the hive user (or any user that has HDFS privileges), then launch Hive and execute your DDL statement:

$ sudo su hive
$ hive

... then your DDL.

The reason is that you still need some HDFS write privileges for the Hive metadata.

If this response addresses your problem, please vote for/accept it as the best answer.


New Member

Why does creating an external table require write permissions? If we have a huge read-only dataset that we want various users to query without duplicating it, what should we do?