Archives of Support Questions (Read Only)

This is an archived board for historical reference. Information and links may no longer be available or relevant.

[CDH 5.3] Hue uses wrong user to run query?

Explorer

Hey,

I've just set up an environment of CDH 5.3 [from CM 5.3] and wanted to check some things with Hive. So I tried to use Hue to do it and got this error:

 

Error while compiling statement: FAILED: RuntimeException Cannot create staging directory 'hdfs://h1.t.pl:8020/jobs/input/customer/.hive-staging_hive_2014-12-30_21-48-01_539_4882491936626833078-1': Permission denied: user=admin, access=WRITE, inode="/jobs/input/customer":hdfs:supergroup:drwxrwxr-x 
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)

admin is the user I'm logged into the Hue web UI as - the default user created during the Cloudera Manager install.

Is it a configuration issue?

 

[edit]

the tables were created from external files using a query run through Hue:

create external table customer (
  C_CUSTKEY int,
  C_NAME string,
  C_ADDRESS string,
  C_NATIONKEY int,
  C_PHONE string,
  C_ACCTBAL decimal(10,2),
  C_MKTSEGMENT string,
  C_COMMENT string
)
row format delimited fields terminated by '|' stored as textfile
location '/jobs/input/customer';
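For reference, the permissions shown in the error can be inspected and changed from the command line. A minimal sketch, assuming a shell on a cluster node with the HDFS client configured; the path and the admin user are taken from the error message above:

```shell
# Inspect the table location named in the error:
# inode="/jobs/input/customer":hdfs:supergroup:drwxrwxr-x
hdfs dfs -ls -d /jobs/input/customer

# Hive creates a .hive-staging_* directory under the table
# location at query time, so the querying user needs WRITE
# access there. One option is to hand the directory to that
# user (run as the HDFS superuser):
sudo -u hdfs hdfs dfs -chown -R admin:supergroup /jobs/input/customer
```

Whether chown to the querying user or a wider chmod is appropriate depends on who else reads and writes the table data.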

 

1 ACCEPTED SOLUTION

Explorer

Well, my problem was kind of Linux-related, I guess, with file permissions. I had 2 logical partitions, mounted at / and /home. By default I had the dir /dfs on the / partition, and when I was low on storage space I decided to create a new directory for dfs under /home. Of course I chowned this folder to the hdfs user (as is done for /dfs), but it gave me the error from the first post.

After I resized the partition and kept only the single /dfs directory as the HDFS storage point on the node, everything worked fine.


4 REPLIES

Super Guru
You seem to have a custom mapreduce.jobtracker.staging.root.dir:
/jobs/input/customer

It means only the 'hdfs' user can submit jobs. You could change it to 777
permissions. I think there is also a way to have one per user.

Romain
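The 777 suggestion above can be applied like this; a sketch, assuming shell access on a cluster node and the ability to run commands as the HDFS superuser. Note that mode 777 makes the directory writable by every user, so chowning it to the specific querying user may be preferable:

```shell
# Make the external table location world-writable so any Hue
# user's query can create its .hive-staging_* directory there:
sudo -u hdfs hdfs dfs -chmod -R 777 /jobs/input/customer

# Verify the new mode bits:
hdfs dfs -ls -d /jobs/input/customer
```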

Explorer

Well, my problem was kind of Linux-related, I guess, with file permissions. I had 2 logical partitions, mounted at / and /home. By default I had the dir /dfs on the / partition, and when I was low on storage space I decided to create a new directory for dfs under /home. Of course I chowned this folder to the hdfs user (as is done for /dfs), but it gave me the error from the first post.

After I resized the partition and kept only the single /dfs directory as the HDFS storage point on the node, everything worked fine.

Super Guru
Thanks for reporting your solution!

Explorer

Yeah, I always try to do that, because I personally hate it when I find a problem similar to mine and the only reply is from the author saying they already managed to resolve it, without explaining how xP

 

regards,

ive