05-29-2021
02:37 PM
@Onedile wrote: Yes, this is possible. You need to kinit with the username that has been granted access to the SQL Server DB and tables. Integrated security passes your credentials to SQL Server using Kerberos: "jdbc:sqlserver://sername.domain.co.za:1433;integratedSecurity=true;databaseName=SCHEMA;authenticationScheme=JavaKerberos;" This worked for me.

It doesn't work. It still fails with the latest MSSQL JDBC driver, because the Kerberos credentials are lost when the mappers spawn (YARN transitions the job to its internal security subsystem):

21/05/29 19:00:40 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1616335290043_2743822
21/05/29 19:00:40 INFO mapreduce.JobSubmitter: Executing with tokens: [Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for c795701: HDFS_DELEGATION_TOKEN owner=c795701@XX.XXXX.XXXXXXX.COM, renewer=yarn, realUser=, issueDate=1622314832608, maxDate=1622919632608, sequenceNumber=29194128, masterKeyId=1856)]
21/05/29 19:01:15 INFO mapreduce.Job: Task Id : attempt_1616335290043_2743822_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.microsoft.sqlserver.jdbc.SQLServerException: Integrated authentication failed. ClientConnectionId:53879236-81e7-4fc6-88b9-c7118c02e7be
Caused by: java.security.PrivilegedActionException: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)

Use the jTDS driver as recommended here.
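As a rough sketch of the jTDS alternative mentioned above: jTDS authenticates with NTLM (domain/user/password) rather than relying on the Kerberos tickets that are lost when YARN spawns the mapper tasks. The hostname, database, domain, account, table, and password-file path below are placeholders, not values from this thread, and the jTDS jar must be on Sqoop's classpath.

```shell
# Hypothetical Sqoop import over jTDS instead of the MSSQL JDBC driver.
# All names/paths below are placeholders; adjust for your environment.
sqoop import \
  --driver net.sourceforge.jtds.jdbc.Driver \
  --connect "jdbc:jtds:sqlserver://servername.domain.co.za:1433/SCHEMA;domain=MYDOMAIN;useNTLMv2=true" \
  --username myuser \
  --password-file hdfs:///user/myuser/.sqlserver.password \
  --table MY_TABLE \
  --target-dir /data/my_table
```

Because the credentials travel in the connection itself, every mapper can authenticate independently of the submitting user's Kerberos ticket cache.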
06-02-2016
07:43 PM
"Does this mean that my user (later on it will be a system user) needs to have a keytab created on the Linux file system and distributed to all the nodes?"

You would put the keytab in HDFS with access rights for only that user, and use the Oozie files tag to load it into your temporary execution directory: https://oozie.apache.org/docs/3.2.0-incubating/WorkflowFunctionalSpec.html#a3.2.7_Java_Action

"Moreover, it might not be a great option, but isn't this authentication possible using only username/password?"

To do this you need PAM or LDAP authentication; that's why I mentioned it :-). You can either hardcode the password or do the same thing we discussed above with a password file in HDFS, on which you can set access rights.

"Option 3 - I'm using the current mechanism as it is the only one I found some examples on the net. I checked shortly on PAM/LDAP; I'm not sure yet if that will require some changes from the Hadoop cluster side. If not, I'll be happy to try it."

https://community.hortonworks.com/articles/591/using-hive-with-pam-authentication.html 🙂
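To illustrate the keytab-in-HDFS approach, here is a minimal sketch of an Oozie workflow using the file tag from the spec linked above. The workflow name, main class, and all paths are hypothetical placeholders; only the file-tag mechanism itself comes from the Oozie Workflow Functional Spec.

```xml
<!-- Hypothetical sketch: an Oozie Java action that ships a keytab from HDFS
     into the action's working directory via the <file> tag.
     Workflow name, main class, and paths are placeholders. -->
<workflow-app name="sqoop-with-keytab" xmlns="uri:oozie:workflow:0.2">
    <start to="import"/>
    <action name="import">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.example.ImportJob</main-class>
            <!-- Copied into the container's working directory as "user.keytab";
                 the HDFS source file should be readable only by its owner. -->
            <file>hdfs:///user/myuser/keytabs/myuser.keytab#user.keytab</file>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Import failed</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

The Java action can then kinit with ./user.keytab at startup, so no keytab needs to be pre-distributed to the worker nodes.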