Support Questions


sqoop script error "... is not a valid DFS filename"

New Contributor

I have a Sqoop script that tries to copy a table from Oracle to Hive, but I get an error about my destination path:

/hdfs://myserver/apps/hive/warehouse/new_schema/new_table is not a valid DFS filename


Can anyone please tell me if my destination path looks correct? I am not trying to set up a file; I just want to copy a table from Oracle to Hive and put it into a schema that already exists in Hive. Below is my script.


#!/bin/bash

sqoop import \
-Dmapred.map.child.java.opts='-Doracle.net.tns_admin=. -Doracle.net.wallet_location=.' \
-files $WALLET_LOCATION/cwallet.sso,$WALLET_LOCATION/ewallet.p12,$TNS_ADMIN/sqlnet.ora,$TNS_ADMIN/tnsnames.ora \
--connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-drop-import-delims \
--hive-import \
--hive-table new_schema.new_table \
--num-mappers 1 \
--hive-overwrite \
--mapreduce-job-name my_sqoop_job \
--delete-target-dir \
--target-dir /hdfs://myserver/apps/hive/warehouse/new_schema.db \
--create-hive-table


2 Replies

Expert Contributor

Your path should not include the "/hdfs://myserver" section.

Also, I think you want to use the --warehouse-dir option.

 

--warehouse-dir specifies a base directory within HDFS; Sqoop will create a subfolder inside it named after the source table and import the data files into that folder.
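For example, the tail of your script could look roughly like this. This is only a sketch: it assumes the common default warehouse path /apps/hive/warehouse, so check the value of hive.metastore.warehouse.dir on your cluster before using it.

```shell
sqoop import \
--connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-import \
--hive-table new_schema.new_table \
--hive-overwrite \
--warehouse-dir /apps/hive/warehouse \
--create-hive-table
```

Note that the path is HDFS-absolute (it starts at /, not at /hdfs://server), and Sqoop will stage the data under /apps/hive/warehouse/&lt;table name&gt; before loading it into Hive.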

New Contributor

"Your path should not include the "/hdfs://myserver" section."

    Excellent! I wasn't sure about that part.

 

Do I just add the --warehouse-dir option, on its own line like the other options?