
Is it possible to use a jceks file with the Sqoop export command?

Expert Contributor

Hi. I have a problem exporting a Hive table to an Oracle database. I want to encrypt and hide the password using a jceks file. I read a great article about using jceks while importing data with Sqoop: Storing Protected Passwords in Sqoop

It works great when I import data from Oracle to Hive. But the problem is that when I try to export data from Hive to Oracle I get an error:

Unable to process alias

The Sqoop command I'm trying to run:

sqoop export \
-Dhadoop.security.credential.provider.path=jceks://hdfs/user/hdfs/pass-enc.jceks \
--connect jdbc:oracle:thin:@1.1.1.1:2222:SID \
--table hive_temp_table_orc \
--username orc_user \
--password-alias oracle.password \
--hcatalog-database default \
--hcatalog-table hive_temp_table  \
--hive-partition-key col1 \
--hive-partition-value 2011-01-01

My question is: is it possible to use a jceks file and the --password-alias parameter with the Sqoop export command, or is that an option only when importing data?
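
For reference, the jceks alias was set up roughly like this (a sketch of the usual hadoop credential steps; the alias name and provider path are the same ones used in the export command above):

# create the credential store and the alias on HDFS (prompts for the Oracle password)
hadoop credential create oracle.password \
    -provider jceks://hdfs/user/hdfs/pass-enc.jceks

# confirm the alias exists in the store
hadoop credential list -provider jceks://hdfs/user/hdfs/pass-enc.jceks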

2 REPLIES


Hi @Mateusz Grabowski!
Could you enable DEBUG on the logs?
I'm looking for a specific error msg from these spots in the source:
https://github.com/apache/sqoop/blob/3233db8e1c481e38c538f4caaf55bcbc0c11f208/src/java/org/apache/sq...

https://github.com/apache/sqoop/blob/0ca73d4e71bf4724cd7dd15faa108e6ee56ee121/src/java/org/apache/sq...

Not sure if you'll be able to do this with sqoop export; I didn't see anything about credentials + password-alias for export mode in the documentation, but let's investigate it further 🙂
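
In the meantime, one way to capture DEBUG output is something like this (just a sketch, assuming the standard Sqoop CLI; --verbose and HADOOP_ROOT_LOGGER are generic logging knobs, not anything specific to this issue):

# rerun the export with verbose Sqoop output and DEBUG-level Hadoop logging on the console
HADOOP_ROOT_LOGGER=DEBUG,console sqoop export --verbose \
  -Dhadoop.security.credential.provider.path=jceks://hdfs/user/hdfs/pass-enc.jceks \
  ... (rest of the options unchanged)

That should make it clearer where the alias lookup fails before the "Unable to process alias" error is raised.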

Hope this helps

New Contributor

Check the name of the alias:

hadoop credential list -provider jceks://hdfs/user/hdfs/pass-enc.jceks
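
If the listed alias doesn't match what you pass to --password-alias (oracle.password here), recreating it under the expected name is one way to fix it, roughly like this (old.alias below is just a placeholder for whatever the list command actually prints):

# remove the mismatched alias and recreate it under the name the export command expects
hadoop credential delete old.alias -provider jceks://hdfs/user/hdfs/pass-enc.jceks
hadoop credential create oracle.password -provider jceks://hdfs/user/hdfs/pass-enc.jceks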