
Error importing data from MySQL to HDFS

Expert Contributor

Hi guys, I'm using Sqoop to import data from MySQL to HDFS with the import-all-tables option, as below:

sqoop import-all-tables --connect jdbc:mysql://master/poc --username root --target-dir /user/hdfs/mysql --mysql-delimiters -m 1

My problem is that I get an error message saying the --target-dir parameter is wrong, but I have checked the documentation and it looks correct. When I run the same command pointing to a local path with --warehouse-dir instead, it works. Could someone tell me where I'm wrong? Thanks, error attached.

15742-sqoop-error.png

1 ACCEPTED SOLUTION


@Andres Urrego

"import-all-tables" does not support "--target-dir". As you've discovered, "--warehouse-dir" should be used instead. Data for each table will be put in a subfolder in the designated warehouse-dir path.

As always, if you find this post helpful, don't forget to accept the answer.



Expert Contributor

Thanks so much @Eyad Garelnabi! Let me ask you something: if I need to move all tables to HDFS, do I need to move them first to a local path and then to HDFS, or create one Sqoop job per table?


@Andres Urrego

Neither. Just use the "--warehouse-dir" flag with "import-all-tables". The directory you specify does not need to be a Hive warehouse directory; it can be any location you choose in HDFS.

The reason you can't use "--target-dir" is that the option is only available when all the imported data is placed in a single folder, whereas "import-all-tables" needs to create a subfolder for each table. The "--warehouse-dir" flag only indicates the parent folder where you want all the data to go, and "import-all-tables" creates a subdirectory under it for each table brought in.

I've assumed with the above that you want to import all tables. However, if you only want to import a few tables, then your best bet is to write a (shell/perl/python/etc.) script that runs multiple Sqoop commands, with each one importing a table.
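A minimal sketch of such a script, one "sqoop import" per table (the table names here are hypothetical; the connection details are the ones from your command). The echo prints each command as a dry run; drop it to actually execute:

```shell
#!/bin/sh
# One Sqoop job per table; each table gets its own --target-dir.
# Table names below are placeholders - replace with your own.
for TABLE in customers orders invoices; do
  echo sqoop import \
    --connect jdbc:mysql://master/poc \
    --username root \
    --table "$TABLE" \
    --target-dir "/user/hdfs/mysql/$TABLE" \
    --mysql-delimiters \
    -m 1
done
```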

Does that clarify things?

Expert Contributor

I have found that both options refer to HDFS, so what can I use to put the data in a local path, outside of HDFS?

15747-sqoop.png


You can't. Sqoop can only be used to import from RDBMS to HDFS (and vice versa). It does not work with other file system interfaces.
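If you do need local copies, the usual workaround (an assumption on my part about your goal, not a Sqoop feature) is to import into HDFS first and then copy down with the HDFS CLI:

```shell
# Copy the imported data from HDFS to the local filesystem.
# /user/hdfs/mysql is the HDFS path from this thread;
# /tmp/mysql-export is a hypothetical local destination.
hadoop fs -get /user/hdfs/mysql /tmp/mysql-export
```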

Expert Contributor

No sir, thanks so much for your help @Eyad Garelnabi!