Created 03-19-2018 03:05 PM
Hi guys,
I am trying to export data from HDFS to SQL Server. It works fine if I provide the password as an argument; however, it fails when I provide a password file instead.
sqoop export \
  --connect "jdbc:sqlserver://abc.com:58850;databaseName=IKB_PROD;schema=dbo;" \
  --table "SQOOP_TEST_SMALL" \
  --export-dir /tmp/SQOOP_TEST_SMALL_20180101_010101 \
  --username HADOOP_USR \
  --password-file /user/gaurang.shah/sqlserver_password.pass \
  --verbose

Warning: /usr/hdp/2.5.3.0-37/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/03/19 14:30:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.3.0-37
18/03/19 14:30:17 DEBUG tool.BaseSqoopTool: Enabled debug logging.
18/03/19 14:30:17 DEBUG password.FilePasswordLoader: Fetching password from specified path: /user/gaurang.shah/sqlserver_password.pass
18/03/19 14:30:18 DEBUG sqoop.ConnFactory: Loaded manager factory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
18/03/19 14:30:18 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
18/03/19 14:30:18 DEBUG sqoop.ConnFactory: Trying ManagerFactory: org.apache.sqoop.manager.oracle.OraOopManagerFactory
18/03/19 14:30:18 DEBUG oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop can be called by Sqoop!
18/03/19 14:30:18 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
18/03/19 14:30:18 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:sqlserver:
18/03/19 14:30:18 INFO manager.SqlManager: Using default fetchSize of 1000
18/03/19 14:30:18 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.SQLServerManager@6f0628de
18/03/19 14:30:18 INFO tool.CodeGenTool: Beginning code generation
18/03/19 14:30:18 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM [SQOOP_TEST_SMALL] AS t WHERE 1=0
18/03/19 14:30:18 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
18/03/19 14:30:18 ERROR manager.SqlManager: Error executing statement: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'HADOOP_USR'.
com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'HADOOP_USR'.
    at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:196)
    at com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:246)
Created 03-19-2018 03:59 PM
Sqoop expects the password file to be located on HDFS. Can you move the file to a directory on HDFS, with the correct permissions, and specify that path and file name? e.g.
/user/gaurang/sqlpwd.pass
A better option is --password-alias, which uses the Hadoop credential provider to store the password instead of keeping it in clear text. This option works with HDP 2.2 or later. Please see https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_connecting_to_a_database_server
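For reference, the credential-provider flow looks roughly like this. The alias name and jceks path below are made-up examples, not from this thread; the connection details are copied from the question:

```shell
# Store the password in a Hadoop credential provider (you are prompted
# for the password interactively; it is written encrypted to the jceks file).
# "sqlserver.password.alias" and the jceks path are example values.
hadoop credential create sqlserver.password.alias \
    -provider jceks://hdfs/user/gaurang.shah/sqoop.jceks

# Reference the alias instead of a password or password file.
sqoop export \
    --connect "jdbc:sqlserver://abc.com:58850;databaseName=IKB_PROD;schema=dbo;" \
    --table "SQOOP_TEST_SMALL" \
    --export-dir /tmp/SQOOP_TEST_SMALL_20180101_010101 \
    --username HADOOP_USR \
    --password-alias sqlserver.password.alias
```

Since the password never touches disk in clear text, this also sidesteps trailing-newline problems entirely.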
Created 03-19-2018 06:16 PM
The issue was the \n character at the end of the file. It works fine against Netezza, but it causes a login failure on SQL Server. The following command, which creates a new password file without the trailing newline, resolved the issue:
tr -d '\n' < sqlserver_password.pass > sqlserver.pass
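To see why the tr fix works: echo appends a trailing \n, which Sqoop then reads as part of the password, while printf does not. A quick local demo (file names are arbitrary examples):

```shell
# echo appends '\n', so the file is one byte longer than the password.
echo 'secret' > with_newline.pass        # 7 bytes: s e c r e t \n

# Stripping newlines recovers the 6-byte password.
tr -d '\n' < with_newline.pass > clean.pass

# printf avoids the trailing newline in the first place.
printf 'secret' > direct.pass

wc -c < with_newline.pass   # 7
wc -c < clean.pass          # 6
cmp -s clean.pass direct.pass && echo "identical"
```

So when creating a password file for Sqoop, either use printf or pipe through tr -d '\n' as above before putting it on HDFS.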