How do I move data from a relational database to Hadoop via a Sqoop command, using an encrypted password?
Here you go, a quick Sqoop command example:
sqoop import \
  -Dorg.apache.sqoop.credentials.loader.class=org.apache.sqoop.util.password.CryptoFileLoader \
  -Dorg.apache.sqoop.credentials.loader.crypto.passphrase=sqoop2 \
  --connect jdbc:mysql://example.com/sqoop \
  --username sqoop \
  --password-file file:///tmp/pass.enc \
  --table tbl
The important parameters to note are:
- -Dorg.apache.sqoop.credentials.loader.class: the credentials loader to use, here CryptoFileLoader, which decrypts the password file
- -Dorg.apache.sqoop.credentials.loader.crypto.passphrase: the passphrase from which the decryption key is derived
- --password-file: the location of the encrypted password file (local FS or HDFS)
There are several other loader options available to you as well, such as the crypto salt, iteration count, and cipher algorithm. A sketch of how to produce the encrypted file follows below.
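To create pass.enc in the first place, the password has to be encrypted the same way CryptoFileLoader will decrypt it. Below is a minimal sketch of a one-off Java helper, assuming the loader's default settings (PBKDF2WithHmacSHA1 key derivation, salt "SALT", 10000 iterations, a 128-bit key, and AES/ECB/PKCS5Padding); the class name EncryptPassword is just for illustration, and if you override any of the crypto -D loader options you must adjust the constants here to match:

import java.nio.file.Files;
import java.nio.file.Paths;
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical helper: encrypts a DB password so CryptoFileLoader can decrypt it.
// The constants assume CryptoFileLoader's defaults; change them to match any
// -Dorg.apache.sqoop.credentials.loader.crypto.* settings you pass to Sqoop.
public class EncryptPassword {
  public static void main(String[] args) throws Exception {
    String password   = args[0];   // the DB password to protect
    String passphrase = args[1];   // same value as ...crypto.passphrase
    String salt       = "SALT";    // assumed default salt
    int    iterations = 10000;     // assumed default PBKDF2 iteration count
    int    keyLenBits = 128;       // assumed default derived-key length

    // Derive the AES key from the passphrase, mirroring the loader's derivation
    SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
    byte[] keyBytes = factory.generateSecret(
        new PBEKeySpec(passphrase.toCharArray(), salt.getBytes(), iterations, keyLenBits)
    ).getEncoded();

    // Encrypt with the loader's default cipher and write the raw bytes out
    Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
    cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"));
    Files.write(Paths.get("pass.enc"), cipher.doFinal(password.getBytes("UTF-8")));
  }
}

Compile it, run java EncryptPassword <db-password> sqoop2 once, then copy the resulting pass.enc to the location you pass via --password-file.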
@Nilesh What's your Sqoop version?
It's 1.4.5.
The most secure way of supplying a password to the database is to save it in a file in the user's home directory with 400 permissions and point to that file with the --password-file argument; this is the preferred method of entering credentials. Sqoop then reads the password from the file and passes it to the MapReduce cluster by secure means, without exposing it in the job configuration. The password file can live either on the local FS or on HDFS. For example:
$ sqoop import --connect jdbc:mysql://database.example.com/employees \
    --username venkatesh --password-file ${user.home}/.password
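Note that Sqoop reads the entire contents of the password file, including any trailing whitespace such as the newline most editors add, so generate the file without one and lock the permissions down. A minimal sketch, with mypassword standing in for the real password:

$ echo -n "mypassword" > ${HOME}/.password
$ chmod 400 ${HOME}/.password

If the job should read the file from HDFS instead, upload it and restrict it the same way:

$ hdfs dfs -put ${HOME}/.password /user/$USER/.password
$ hdfs dfs -chmod 400 /user/$USER/.password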
How do you create an encrypted password file ("pass.enc") for Sqoop? All you did was post a link on how to save it in clear text, which is not secure enough for production environments. Thanks