Credential provider for Sqoop does not work when the Kerberos ticket is refreshed

I have the following Sqoop command to import data from Oracle into a Hive instance:

 

sqoop import -Dhadoop.security.credential.provider.path=${keystore_custom} \
-Dmapred.job.queue.name=${queue} \
--connect jdbc:oracle:thin:@${hostname}:${port}:${custom_db} \
--username ${username} \
--password-alias ${custom_alias} \
--query 'SELECT * FROM table C where $CONDITIONS' \
--hive-import \
--delete-target-dir \
--null-string '\\N' \
--null-non-string '\\N' \
--hive-drop-import-delims \
--hive-table hive_instance.table \
--target-dir /user/hive_instance/table -m 1;
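
For reference, the alias was created with the standard hadoop credential CLI, roughly along these lines (the alias name and provider path below are placeholders standing in for ${custom_alias} and ${keystore_custom}):

# placeholder alias and provider path, matching ${custom_alias} and ${keystore_custom}
hadoop credential create custom_alias \
-provider jceks://hdfs/user/myuser/keystore_custom.jceks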

 

The Sqoop command above works normally when triggered from the edge node. But when the Kerberos ticket is refreshed using kinit -R or kinit user@realm -k -t /home/user/keytab/user.keytab, the credential provider stops working and fails with the following error:

 

Warning: /opt/cloudera/parcels/CDH-5.9.1-1.cdh5.9.1.p0.4/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/04/08 08:42:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.9.1
Error resolving password from the credential providers.
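
If it helps with diagnosis, the provider can also be queried directly outside of Sqoop (same placeholder provider path as above):

# same placeholder provider path as in the credential create sketch above
hadoop credential list -provider jceks://hdfs/user/myuser/keystore_custom.jceks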

 

When I remove --password-alias and replace it with --password, I temporarily lose write permission to the HDFS filesystem instead. In other words, once kinit has been executed, I lose my write permission to HDFS.
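
A minimal way to reproduce the permission symptom independently of Sqoop is a trivial HDFS write after the refresh (the test path below is a placeholder; any writable directory works):

# placeholder test path
hdfs dfs -touchz /user/hive_instance/_kinit_write_test
hdfs dfs -rm /user/hive_instance/_kinit_write_test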

 

The issue above does not occur when the job is run from a new session, without the Kerberos ticket having been refreshed.