Member since: 11-15-2019 · Posts: 2 · Kudos Received: 0 · Solutions: 0
01-15-2021 05:15 AM
"Your path should not include the /hdfs://myserver section." Excellent! I wasn't sure about that part. Do I just add the --warehouse-dir option, just like that, on its own line?
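For reference, a minimal sketch of how that flag is usually placed, assuming the standard Sqoop import option `--warehouse-dir` (Sqoop spells it with a hyphen) and a hypothetical HDFS path:

```shell
# Sketch only: each option is one line ending in a backslash continuation.
sqoop import \
  --connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
  --table orignal_schema.orignal_table \
  --hive-import \
  --hive-table new_schema.new_table \
  --warehouse-dir /apps/hive/warehouse  # parent HDFS dir, no hdfs://host prefix
```

So yes: it is simply added as one more `--option \` line anywhere before the last option in the command.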
01-14-2021 11:21 AM
I have a sqoop script that tries to copy a table from Oracle to Hive, and I get an error about my destination path:

/hdfs://myserver/apps/hive/warehouse/new_schema/new_table is not a valid DFS filename

Can anyone please tell me if my destination path looks correct? I am not trying to set up a file; I just want to copy a table from Oracle to Hive and put it into a schema that already exists in Hive. Below is my script.

#!/bin/bash
sqoop import \
-Dmapred.map.child.java.opts='-Doracle.net.tns_admin=. -Doracle.net.wallet_location=.' \
-files $WALLET_LOCATION/cwallet.sso,$WALLET_LOCATION/ewallet.p12,$TNS_ADMIN/sqlnet.ora,$TNS_ADMIN/tnsnames.ora \
--connect jdbc:oracle:thin:/@MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-drop-import-delims \
--hive-import \
--hive-table new_schema.new_table \
--num-mappers 1 \
--hive-overwrite \
--mapreduce-job-name my_sqoop_job \
--delete-target-dir \
--target-dir /hdfs://myserver/apps/hive/warehouse/new_schema.db \
--create-hive-table
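As a hedged sketch of the likely fix (host and port below are placeholders): the "is not a valid DFS filename" error usually comes from the leading slash before the `hdfs://` scheme. A target directory is normally given either as a plain absolute HDFS path, resolved against the cluster's default filesystem, or as a full URI:

```shell
# Plain HDFS path (most common form):
--target-dir /apps/hive/warehouse/new_schema.db/new_table \

# Or a full URI -- note: no leading slash before hdfs://
--target-dir hdfs://myserver:8020/apps/hive/warehouse/new_schema.db/new_table \
```

Mixing the two forms, as in `/hdfs://myserver/...`, produces a path that HDFS cannot parse.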
Labels:
- Apache Sqoop