I came across the same issue. But when I looked at the Hive logs as suggested by Neeraj, I found that a space quota was set at my directory level. Once I cleared the quota, I was able to create tables even for large data sets.
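For anyone else hitting this: you can check whether a quota is set on a directory and clear it from the command line. This is a sketch; the path /user/root is just an example, substitute your own, and clearing a quota requires HDFS superuser rights.

```shell
# Show quota, remaining quota, space quota, and remaining space quota for the dir
# (QUOTA/SPACE_QUOTA columns read "none"/"inf" when no quota is set)
hdfs dfs -count -q -v /user/root

# Clear the space quota on the directory (must be run as an HDFS superuser)
hdfs dfsadmin -clrSpaceQuota /user/root

# Clear the name (file-count) quota as well, if one was set
hdfs dfsadmin -clrQuota /user/root
```

After clearing, re-run the failing INSERT/CREATE and the MoveTask step should succeed if the quota was the cause.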
Hey, can you let me know how you did it?
I am also facing the same issue. I am logged into Ambari as the admin user and trying to create/move files from/to /user/root/satish/. Here are the details on the folder permissions.
[root@sandbox sat]# hadoop fs -ls /user/root/satish/
Found 3 items
drwxr-xr-x   - root hdfs          0 2017-06-14 00:54 /user/root/satish/input
drwxr-xr-x   - root hdfs          0 2017-06-14 00:55 /user/root/satish/output
drwxr-xr-x   - root hdfs          0 2017-06-14 00:55 /user/root/satish/scripts
I tried with a different path too, and I am getting the same error. Please let me know if I am missing anything here.
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
I am facing the same issue: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. SQL Exception Occurred.
Any pointers are appreciated.
I faced a similar issue. Fortunately for me, the destination dir was not present. Once I created it, the error got resolved.
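In case it helps others, creating the missing destination directory looks like this; the path below is only an example, use whatever location your Hive table or LOAD/INSERT statement points at, and adjust the owner to the user running the query.

```shell
# Create the destination directory (and any missing parents) in HDFS
hdfs dfs -mkdir -p /user/root/satish/output

# Make sure the querying user can write to it
hdfs dfs -chown root:hdfs /user/root/satish/output
hdfs dfs -chmod 775 /user/root/satish/output
```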
I am getting the same error when I use INSERT OVERWRITE, but the same query executes fine with INSERT INTO. Please let me know the solution.