Below is the Sqoop command I am running:
sqoop import \
  --connect "jdbc:oracle:thin:@orac-prd03-vip.healthnet.com:1725/odsprd" \
  --username "ods" --password "Odsmr23prod$" \
  --query "select client_id,host_system_id,client_name,insert_timestamp from CLIENT_MASTER where INSERT_TIMESTAMP>=TO_DATE('06112017','DDMMYYYY') AND \$CONDITIONS" \
  --delete-target-dir \
  --hive-import --hive-overwrite \
  -m 1 \
  --hive-table etl_wds_tmp.mr_client_master_external_test11 \
  --target-dir "hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11" \
  --fields-terminated-by '|'
It fails with the following error:
"Failed with exception Unable to move source hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11/part-m-00000 to destination hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11/part-m-00000 FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask"
Which user are you running the shell as? Are you able to write a file to that directory? If yes, it could be the temporary cache getting cleared and the process being killed abnormally.
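A quick way to answer both questions from the command line is a write test against the target directory from the error message (run these as the same OS user that launches the Sqoop job; the path is taken from the thread):

```shell
# Show which user you are running as
whoami

# Inspect ownership and permissions of the target directory
hdfs dfs -ls -d /etl/wds/tmp/mr_client_master_external_test11

# Try to create (then remove) a zero-byte file there;
# an AccessControlException here confirms a permission problem
hdfs dfs -touchz /etl/wds/tmp/mr_client_master_external_test11/_write_test
hdfs dfs -rm /etl/wds/tmp/mr_client_master_external_test11/_write_test
```

These commands require access to the cluster, so run them from an edge node where the `hdfs` client is configured.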
When I change the permissions on "/etl/wds/tmp/mr_client_master_external_test11/" and rerun the Sqoop command, the directory goes back to its previous permissions.
What is the user name you are using to run the Sqoop job?
Ensure that user has permission to write to hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11/. There are two ways to achieve this:
1) Change the permissions of the directory hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11/ so that your user can write to it.
2) Run the job as a user who already has write permission on hdfs://CENTENEHADOOP2/etl/wds/tmp/mr_client_master_external_test11/.
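For option 1, the commands would look roughly like the sketch below. Note that `etl_user` is a placeholder for whichever account actually runs the Sqoop job, and `sudo -u hdfs` assumes a typical unsecured cluster where `hdfs` is the HDFS superuser:

```shell
# Open up group write access on the target directory
hdfs dfs -chmod -R 775 /etl/wds/tmp/mr_client_master_external_test11

# Or transfer ownership to the user running the Sqoop job
# ('etl_user' and the 'hadoop' group are placeholders)
sudo -u hdfs hdfs dfs -chown -R etl_user:hadoop /etl/wds/tmp/mr_client_master_external_test11
```

One caveat worth checking: the command above uses `--delete-target-dir`, which deletes and recreates the target directory on every run. That would explain why manual permission changes on the directory itself do not stick; in that case, fixing the permissions (or ownership) of the parent directory `/etl/wds/tmp` is the more durable approach, since the recreated directory inherits its writability from the parent.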