
Re: SQLServer Sqoop Import Integrated Security Kerberos - Failed to find any Kerberos ticket.

Based on the behavior, it appears that sqoop list-databases executes locally on the client host and therefore has access to the user's ticket cache on that host. Sqoop is a wrapper for the creation and submission of MapReduce jobs (everything here is with respect to MRv2 on YARN).

When sqoop import is invoked from the client, resources for the associated Map tasks are allocated by the YARN ResourceManager in the form of YARN containers that are scheduled to run on the cluster's worker nodes. These containers are not forwarded Kerberos tickets; instead, they are given delegation tokens (requested by the client and associated with the container's launch context). More details are available here and here.

These delegation tokens can be used to access Hadoop resources, such as HDFS, under the client's authorization context. My speculation is that, because the execution context for the Map tasks does not contain the original TGT, Kerberos authentication fails for services outside of Hadoop that don't understand Hadoop delegation tokens, such as SQL Server.
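For reference, here is roughly what that looks like in practice (a sketch only; the integratedSecurity and authenticationScheme properties below are my assumption of what a Kerberos-based Microsoft JDBC URL would contain, not something taken from the original question):

# On the client/edge node the user's ticket cache is present, so a locally
# executed command can authenticate via Kerberos:
klist
sqoop list-databases --connect "jdbc:sqlserver://<Hostname>:<Port>;integratedSecurity=true;authenticationScheme=JavaKerberos"

# sqoop import, however, runs its Map tasks in YARN containers on the worker nodes.
# Those containers carry delegation tokens rather than the TGT, so the same
# connection string fails there with "Failed to find any Kerberos ticket".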

Re: SQLServer Sqoop Import Integrated Security Kerberos - Failed to find any Kerberos ticket.


I found a solution to this, provided by another user here: https://community.hortonworks.com/questions/20719/sqoop-to-sql-server-with-integrated-security.html

Basically, you switch to the jTDS driver, which you can download here: http://jtds.sourceforge.net/
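Before running anything, the jTDS jar has to be on Sqoop's classpath on the node you launch the command from. A minimal sketch (the jar version and the paths below are assumptions; adjust them for your distribution):

# Copy the downloaded jar (e.g. jtds-1.3.1.jar) into Sqoop's lib directory
cp jtds-1.3.1.jar /usr/hdp/current/sqoop-client/lib/
# On other layouts this may be $SQOOP_HOME/lib or /var/lib/sqoop instead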

Per Rajendra Manjunath:

"

Sqoop import from SQL Server to HDFS worked once the Windows credentials were passed explicitly as parameters on the JDBC connection string. Integrated security is not supported by the SQL Server driver here because of the way Kerberos authentication works (delegation tokens are distributed over the cluster while the MR job runs).

So we need to pass the Windows username and password, with integrated security disabled, to import the data. Since the normal SQL Server driver does not support this, I used jtds.jar and its driver class to pull the data into the Hadoop lake.

Sample command I tried on the server:

sqoop import --table Table1 --connect "jdbc:jtds:sqlserver://<Hostname>:<Port>;useNTLMv2=true;domain=<WindowsDomainName>;databaseName=XXXXXXXXXXXXX" \
--connection-manager org.apache.sqoop.manager.SQLServerManager --driver net.sourceforge.jtds.jdbc.Driver --username XXXXX --password 'XXXXXXX' \
--verbose --target-dir /tmp/33 -m 1 -- --schema dbo

"

Here are some examples that worked for me:

# List databases

sqoop list-databases \
  --connect "jdbc:jtds:sqlserver://databasehostname.yourdomain.com:1433;useNTLMv2=true;domain=myactivedirectorydomain.com" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX -P

# List tables

sqoop list-tables \
  --connect "jdbc:jtds:sqlserver://databasehostname.yourdomain.com:1433;useNTLMv2=true;domain=myactivedirectorydomain.com;databaseName=DATABASENAMEHERE" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username jmiller.admin -P

# Pull data example

sqoop import --table TABLENAMEHERE \
  --connect "jdbc:jtds:sqlserver://databasehostname.yourdomain.com:1433;useNTLMv2=true;domain=myactivedirectorydomain.com;databaseName=DATABASENAMEHERE" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX -P \
  --fields-terminated-by '\001' --target-dir /user/XXXXX/20170313 -m 1 -- --schema dbo

Note: In the examples above, change the username to your own, and change the database name in the list-tables and import commands to the one you need (the AD account you use will require access to the data).
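One optional refinement: instead of typing the password interactively with -P (or putting it on the command line with --password), Sqoop can read it from a restricted file in HDFS via --password-file. A sketch, assuming the same connection details as above and an example path of /user/XXXXX/.sqoop_pwd:

echo -n 'XXXXXXX' > .sqoop_pwd
hdfs dfs -put .sqoop_pwd /user/XXXXX/.sqoop_pwd
hdfs dfs -chmod 400 /user/XXXXX/.sqoop_pwd

sqoop import --table TABLENAMEHERE \
  --connect "jdbc:jtds:sqlserver://databasehostname.yourdomain.com:1433;useNTLMv2=true;domain=myactivedirectorydomain.com;databaseName=DATABASENAMEHERE" \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --username XXXXX --password-file /user/XXXXX/.sqoop_pwd \
  --fields-terminated-by '\001' --target-dir /user/XXXXX/20170313 -m 1 -- --schema dbo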