sqoop script error "... is not a valid DFS filename"

New Contributor

 

I have a sqoop script, trying to copy a table from Oracle to Hive. I get an error regarding my destination path:

/hdfs://myserver/apps/hive/warehouse/new_schema/new_table is not a valid DFS filename

Can anyone please tell me if my destination path looks correct? I am not trying to set up a file; I just want to copy a table from Oracle to Hive and put it into a schema that already exists in Hive. Below is my script.

#!/bin/bash

sqoop import \
-Dmapred.map.child.java.opts='-Doracle.net.tns_admin=. -Doracle.net.wallet_location=.' \
-files $WALLET_LOCATION/cwallet.sso,$WALLET_LOCATION/ewallet.p12,$TNS_ADMIN/sqlnet.ora,$TNS_ADMIN/tnsnames.ora \
--connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-drop-import-delims \
--hive-import \
--hive-table new_schema.new_table \
--num-mappers 1 \
--hive-overwrite \
--mapreduce-job-name my_sqoop_job \
--delete-target-dir \
--target-dir /hdfs://myserver/apps/hive/warehouse/new_schema.db \
--create-hive-table

2 Replies

Rising Star

Your path should not include the "/hdfs://myserver" section.

Also, I think you want to use the --warehouse-dir option.

--warehouse-dir specifies a base directory in HDFS; Sqoop creates a subdirectory under it named after the source table and imports the data files into that subdirectory.
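
For example, here is a minimal sketch of how the import might look with those two changes (the wallet/TNS options from the original script are omitted for brevity, and the /user/myuser/sqoop_staging path is only an example; any HDFS directory you can write to will do):

# Sketch only: plain HDFS path, no hdfs://host prefix, staging via --warehouse-dir
sqoop import \
--connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-drop-import-delims \
--hive-import \
--hive-table new_schema.new_table \
--hive-overwrite \
--create-hive-table \
--num-mappers 1 \
--warehouse-dir /user/myuser/sqoop_staging

With --hive-import, the files written under /user/myuser/sqoop_staging/orignal_schema.orignal_table are only a staging copy; Hive then loads (moves) them into its own warehouse location, so the staging path does not need to point at /apps/hive/warehouse.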

New Contributor

"Your path should not include the "/hdfs://myserver" section."

    Excellent! I wasn't sure about that part.

Do I just add the --warehouse-dir option, just like that on its own line?
