Support Questions


Insert overwrite query failing with Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

avatar

I am running an INSERT OVERWRITE query as below.

Insert overwrite directory '/org/data/tmp/webapptempspace/UC3/log' select a.* from a join b on ucase(a.name)=ucase(b.name);

It works fine when one of the tables has a smaller dataset, but when both tables have huge data, it throws the error below.

Failed with exception Unable to move source /org/data/tmp/webapptempspace/UC3/log/.hive-staging_hive_2016-01-11_04-31-06_067_6297667876520454770-1/-ext-10000
to destination 

avatar

@pooja khandelwal Please check the hive log details and see if you have permission to write.
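In case it helps, here is a quick way to check write access from the command line (the paths are the ones from the question, so adjust them to your environment):

hadoop fs -ls -d /org/data/tmp/webapptempspace/UC3/log
# try writing a test file as the same user that runs the query
hadoop fs -touchz /org/data/tmp/webapptempspace/UC3/log/_perm_test
hadoop fs -rm /org/data/tmp/webapptempspace/UC3/log/_perm_test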

avatar

Yes, I have permission to write. I am able to run the query for a smaller dataset, but not for a larger dataset.

avatar
New Contributor

I am facing the same problem. @pooja khandelwal, has this been resolved? Can you share your solution or accept the best answer?

avatar
New Contributor

This could be a permissions issue. You can check the HiveServer2 log for the error. The log will be in /var/log/hive on the node that you connect to with Hive.
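For example, something like the following should surface the underlying error (the log file name and location can differ between versions, so treat this as a sketch):

tail -n 200 /var/log/hive/hiveserver2.log
grep -i "unable to move" /var/log/hive/hiveserver2.log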

avatar

My table 'a' is in text format and 'b' is in ORC format. When I keep both as text, it works fine.
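If the format mismatch really is the trigger, one workaround sketch would be to stage a text copy of 'b' and join against that (b_text is a hypothetical staging table name; the rest is taken from the original query):

create table b_text stored as textfile as select * from b;
Insert overwrite directory '/org/data/tmp/webapptempspace/UC3/log' select a.* from a join b_text on ucase(a.name)=ucase(b_text.name);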

avatar
Explorer

Is there any more information in the logs? (If you're using HiveServer2, you can look at those logs, or at your client logs if you are using the Hive client directly.) The error is caused by a file system error where it's unable to move the file. The file system should log an error message describing why it failed.

avatar
Rising Star

What MoveTask does is move files from the /tmp volume to the /user volume. If the user running the query doesn't have the right permissions, moving files between volumes is not allowed and this exception is thrown.

Possible workarounds:

- Check that /user and /tmp have full permissions.

- Check whether the following properties are set to true: hive.metastore.client.setugi=true and hive.metastore.server.setugi=true. These parameters instruct Hive to execute jobs as the current shell user (see the check sketched below).

- If not, try executing the query as root.
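For reference, a quick way to check whether those properties are already set (the property names are taken from this reply and the config path may differ on your cluster, so verify both against your Hive version):

grep -B1 -A2 "setugi" /etc/hive/conf/hive-site.xml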

avatar
Mentor

@pooja khandelwal has this been resolved? Can you provide your solution or accept the best answer?

avatar
New Contributor

I came across the same issue. But when I looked at the Hive logs as suggested by Neeraj, I found that a space quota was set at my directory level. Once I cleared the quota, I was able to create the table even for large data sets.
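In case it helps others, these are the kinds of commands that can be used to inspect and clear an HDFS space quota (clearing it needs HDFS superuser rights; the directory path here is just an example):

hadoop fs -count -q /org/data/tmp/webapptempspace
hdfs dfsadmin -clrSpaceQuota /org/data/tmp/webapptempspace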

avatar

Hey, can you let me know how you did it?

avatar
Expert Contributor

Hi All,

I am also facing the same issue. I am logged into Ambari as the admin user and am trying to create/move files to and from /user/root/satish/. Here are the details of the folder permissions.

[root@sandbox sat]# hadoop fs -ls /user/root/satish/
Found 3 items
drwxr-xr-x   - root hdfs          0 2017-06-14 00:54 /user/root/satish/input
drwxr-xr-x   - root hdfs          0 2017-06-14 00:55 /user/root/satish/output
drwxr-xr-x   - root hdfs          0 2017-06-14 00:55 /user/root/satish/scripts

I tried with a different path too, and I am getting the same error. Please let me know if I am missing anything here.

Error:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

avatar
Rising Star

I am facing the same issue:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask SQL Exception Occured

Any pointers are appreciated.

avatar

I faced a similar issue. Fortunately for me, the destination directory was not present. Once I created it, the error got resolved.
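For anyone hitting the same thing, a minimal sketch (the directory path is taken from the original question; adjust it to yours):

hadoop fs -mkdir -p /org/data/tmp/webapptempspace/UC3/log
hadoop fs -ls -d /org/data/tmp/webapptempspace/UC3/log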

avatar
New Contributor

I am getting the same error when I use INSERT OVERWRITE, but the same query executes fine with INSERT INTO. Please let me know the solution.
