
Impala SQL: Unable to LOAD DATA from HDFS path due to WRITE permissions

New Contributor

Hi all,

I'm using the official Impala Docker image "cloudera/quickstart". I can upload a TEXT-formatted file to an HDFS location. However, when I execute the LOAD DATA command to migrate the data, I receive the following error:

[Simba]ImpalaJDBCDriver ERROR processing query/statement. Error Code: 0, SQL state: TStatus(statusCode:ERROR_STATUS, sqlState:HY000, errorMessage:AnalysisException: Unable to LOAD DATA from hdfs://quickstart.cloudera:8020/user/customer.tbl.1 because Impala does not have WRITE permissions on its parent directory hdfs://quickstart.cloudera:8020/user ), Query: load data inpath '/user/customer.tbl.1' overwrite into table my_table. [SQL State=HY000, DB Errorcode=500051]

My SQL statement is:

load data inpath '/user/customer.tbl.1' overwrite into table my_table
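
For reference, the permissions on the source file and its parent directory (the same paths that appear in the error) can be checked with:

hdfs dfs -ls -d /user
hdfs dfs -ls /user/customer.tbl.1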

Does anyone know what is going on?

 

Thanks for any help!

1 ACCEPTED SOLUTION

Champion

@yexianyi

 

 

The actual answer to your question is that you need to change the owner/group of /user/customer.tbl.1 so that it is accessible by hive/impala.
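
For example, something along these lines should work on the quickstart image (the impala user/group and the /user/impala/staging directory are my assumptions, so adjust them to your setup). Note that LOAD DATA moves the file, so Impala also needs WRITE access on the file's parent directory, which is why staging the file in a directory Impala owns is the simplest fix:

# run as the HDFS superuser on the quickstart host
sudo -u hdfs hdfs dfs -chown impala:impala /user/customer.tbl.1
# stage the file in a directory Impala can write to, instead of /user itself
# (/user/impala/staging is just an example path)
sudo -u hdfs hdfs dfs -mkdir -p /user/impala/staging
sudo -u hdfs hdfs dfs -chown -R impala:impala /user/impala/staging
sudo -u hdfs hdfs dfs -mv /user/customer.tbl.1 /user/impala/staging/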

 

In addition to that, the default Cloudera-recommended path for Hive/Impala tables is "/user/hive/warehouse/".

 

So in your case, create a DB called customer in the default path as follows, make sure the owner/group is accessible by hive/impala, and try again:

 

hdfs dfs -ls /user/hive/warehouse/customer.db

 

hdfs dfs -ls /user/hive/warehouse

drwxrwxrwt   - hive       hive           0 2016-11-25 15:11 /user/hive/warehouse/customer.db
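
Putting it together, a minimal sketch of the whole flow might look like this (the database and table names are only illustrative, and the staging path follows the earlier example):

# create the database; its directory is created under /user/hive/warehouse
impala-shell -q "CREATE DATABASE IF NOT EXISTS customer;"
# verify the directory exists and is owned by hive/impala
hdfs dfs -ls /user/hive/warehouse
# load the staged file; LOAD DATA moves it into the table's warehouse directory
# (assumes my_table already exists in the customer database and the file was staged as above)
impala-shell -q "LOAD DATA INPATH '/user/impala/staging/customer.tbl.1' OVERWRITE INTO TABLE customer.my_table;"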


3 REPLIES


New Contributor
Thank you so much!! It's working now.

New Contributor

Is there a way to change HDFS permissions when importing files via Sqoop?

 

https://stackoverflow.com/questions/49759591/sqoop-how-to-change-hdfs-permissions-on-imported-files