Permission error in HDP 2.4 when trying to import a MySQL table to Hive using Sqoop

Explorer

I am using the HDP 2.4 sandbox. I created a few tables in MySQL (in a new database) and I am trying to import them into a Hive database called "cards", but the import fails with the error below after creating the first table. The same command works fine in the HDP 2.3 sandbox.

I had given 777 permissions to all the folders, as mentioned in the posts below, but I still get the error.

https://community.hortonworks.com/questions/9984/permission-denied-using-sqoop-from-mysql-on-sandbo....
https://community.hortonworks.com/questions/10949/permission-denied-for-user-while-creating-a-hive-t...
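
Roughly, the permissions were applied with a recursive chmod along these lines (the exact path here is an assumption based on the warehouse directory used in the command below):

    # assumed path; the linked posts suggest the same kind of chmod on the warehouse folders
    hdfs dfs -chmod -R 777 /apps/hive/warehouse/cards.db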

Sqoop command: sqoop import-all-tables --connect "jdbc:mysql://sandbox.hortonworks.com:3306/retail_db" --username retail_dba --hive-database cards -m 1 --warehouse-dir /apps/hive/warehouse/cards.db --hive-import --create-hive-table --hive-overwrite --compress --driver com.mysql.jdbc.Driver --password hadoop;

Error Message:

16/05/17 07:20:42 INFO mapreduce.ImportJobBase: Transferred 576 bytes in 20.5035 seconds (28.0928 bytes/sec)
16/05/17 07:20:42 INFO mapreduce.ImportJobBase: Retrieved 58 records.
16/05/17 07:20:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t WHERE 1=0
16/05/17 07:20:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM categories AS t WHERE 1=0
16/05/17 07:20:42 INFO hive.HiveImport: Loading uploaded data into Hive
Logging initialized using configuration in jar:file:/usr/hdp/2.4.0.0-169/hive/lib/hive-common-1.2.1000.2.4.0.0-169.jar!/hive-log4j.properties
OK
Time taken: 8.528 seconds
Loading data to table cards.categories
Moved: 'hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/cards.db/categories/part-m-00000.gz' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Failed with exception Unable to move source hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/cards.db/categories/part-m-00000.gz to destination hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/cards.db/categories/part-m-00000.gz
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

6 REPLIES

Expert Contributor

Just curious: how did you conclude that it is a permission issue?

The error shows: "Unable to move source hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/cards.db/categories/part-m-00000.gz to destination hdfs://sandbox.hortonworks.com:8020/apps/hive/warehouse/cards.db/categories/part-m-00000.gz. FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask"

That says nothing about permissions. Did you take a look at the YARN application log?
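
For example, once you have the application ID printed by the failed job, the aggregated logs can be fetched with:

    # <application_id> is a placeholder for the ID shown in the Sqoop/MapReduce output
    yarn logs -applicationId <application_id>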

Explorer

This error (org.apache.hadoop.hive.ql.exec.MoveTask) is usually associated with permissions. But I will check the YARN logs too.

Master Guru (accepted solution)

Remove --warehouse-dir from your Sqoop command, or move it to some neutral location, for example under your home directory in HDFS. Note that /apps/hive/warehouse is where Hive keeps its managed (internal) tables; nobody else is supposed to write anything there. Sqoop's --warehouse-dir, on the other hand, is a temporary staging location used during a Hive import; it also serves as the default location for a plain HDFS import when --target-dir is not provided.
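
For example, a sketch of the same command with --warehouse-dir pointing at a neutral staging location under the user's HDFS home (the /user/root/sqoop_staging path is only an illustration):

    # same options as the original command, only --warehouse-dir changed
    sqoop import-all-tables \
      --connect "jdbc:mysql://sandbox.hortonworks.com:3306/retail_db" \
      --username retail_dba --password hadoop \
      --driver com.mysql.jdbc.Driver \
      --hive-import --hive-database cards --create-hive-table --hive-overwrite \
      --compress -m 1 \
      --warehouse-dir /user/root/sqoop_staging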

Explorer

I am trying to import tables into a specific database created in Hive called 'cards'. It works just fine if I import to HDFS, but it fails when I try to create a table in the 'cards' database in Hive.

Master Guru

You have already told Sqoop about your intention with the --hive-database option. By your choice of --warehouse-dir location you are trying to "assist" Sqoop, but you are actually obstructing it: when importing into Hive, Sqoop first imports the table files into --warehouse-dir and then moves them from there into the Hive warehouse. So in your case the files are already at the final location, but Sqoop is unaware of that and tries to move them onto the same path, which by default fails in HDFS. As a solution, as mentioned above, just drop your --warehouse-dir option and retry.
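
The clash can be seen in HDFS: after the failed run, the staged part file is already sitting at the destination named in the error, which a simple listing will confirm:

    # path taken from the error message in the question
    hdfs dfs -ls /apps/hive/warehouse/cards.db/categories/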

Explorer

You are perfectly correct. I tried both without --warehouse-dir and with a different path for it; both worked just fine. Thanks for your explanation.