Member since 10-17-2017
01-15-2019
07:20 AM
I am using an HDP-2.6.3 cluster and trying to read from one Kafka cluster and write the data to a Kerberos-enabled Kafka cluster, and vice versa. I am able to do this only when both Kafka clusters are Kerberos-enabled. Kindly let me know how we can access secure and non-secure components simultaneously in a Spark application.
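One pattern for the scenario above is to give each Kafka client its own security settings, since security.protocol is a per-client property rather than a JVM-wide one. A minimal sketch, assuming hypothetical broker hostnames and showing only the security-related properties (the JAAS entry for the secure side would still come from -Djava.security.auth.login.config):

```java
import java.util.Properties;

public class MixedSecurityKafkaConfig {

    // Consumer properties for the non-secure source cluster: plain listener, no SASL.
    public static Properties sourceProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "plain-broker:9092");   // hypothetical host
        props.put("security.protocol", "PLAINTEXT");
        props.put("group.id", "mixed-security-demo");          // hypothetical group id
        return props;
    }

    // Producer properties for the Kerberos-enabled sink cluster: SASL + GSSAPI.
    // The KafkaClient JAAS login entry is resolved from the file named by
    // -Djava.security.auth.login.config at client-creation time.
    public static Properties sinkProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "secure-broker:9092");  // hypothetical host
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");
        return props;
    }
}
```

Because both Properties objects live in the same JVM, a single Spark application can hold a PLAINTEXT consumer and a SASL producer side by side; the part that is global is only the JAAS configuration file, which needs a KafkaClient entry for the secure cluster.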
02-21-2018
05:31 AM
Thanks for your reply, Harald. I am able to kinit with the provided keytab and principal, but I am using the following code to get a ticket on the machine where the Spark executor is running:

UserGroupInformation.setLoginUser(UserGroupInformation.loginUserFromKeytabAndReturnUGI("abc", "~/abc.keytab"));

Also, as per your suggestion, I replaced _HOST with the active NameNode hostname, but I am still getting the same error as above.
02-20-2018
09:08 AM
I am using Spark standalone 1.6.x to connect to Kerberos-enabled Hadoop 2.7.x in an HDP cluster. The scenario works if I run the code in Spark local mode; I face the issue below only in cluster and client mode.

JavaDStream<String> status = stream.map(new Function<String, String>() {
    public String call(String arg0) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@REALM");
        UserGroupInformation.setConfiguration(conf);
        // Note: "~" is not expanded by the JVM; the keytab path should be absolute.
        UserGroupInformation.setLoginUser(
                UserGroupInformation.loginUserFromKeytabAndReturnUGI("abc", "~/abc.keytab"));
        System.out.println("Logged in successfully.");
        FileSystem fs = FileSystem.get(new URI(activeNamenodeURI), conf);
        for (FileStatus fileStatus : fs.listStatus(new Path("/"))) {
            System.out.println(fileStatus.getPath().toString());
        }
        return "success";
    }
});
Getting the below exception:

User : abc@REALM (auth:KERBEROS)
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hostname1/0.0.0.0"; destination host is: "hostname2":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy44.create(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:295)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy45.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1725)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1668)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1593)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:397)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:393)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:393)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:337)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:889)
    at com.abc.HDFSFileWriter.createOutputFile(HDFSFileWriter.java:354)
    ... 21 more
Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:643)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:730)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
    at org.apache.hadoop.ipc.Client.call(Client.java:1438)
    ... 43 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:172)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
    at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
    at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
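For context, the usual pattern for this kind of "Client cannot authenticate via:[TOKEN, KERBEROS]" failure is to run the filesystem calls inside the logged-in UGI's doAs(...), so the Kerberos credentials are attached to the thread making the RPC. A sketch of that pattern only, not a confirmed fix for this cluster; the principal, keytab path, and NameNode URI are placeholders (requires hadoop-common on the classpath):

```java
import java.net.URI;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureHdfsLister {

    public static void list(final String nameNodeUri) throws Exception {
        final Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@REALM"); // placeholder realm

        UserGroupInformation.setConfiguration(conf);
        // The keytab path must be absolute; "~" is not expanded by the JVM.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "abc", "/home/abc/abc.keytab"); // placeholder principal and path

        // Run the HDFS calls as the logged-in user so the RPC client
        // finds Kerberos credentials on this thread.
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                FileSystem fs = FileSystem.get(new URI(nameNodeUri), conf);
                for (FileStatus st : fs.listStatus(new Path("/"))) {
                    System.out.println(st.getPath().toString());
                }
                return null;
            }
        });
    }
}
```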
10-23-2017
09:37 AM
I am connecting through a Java class where -Djava.security.auth.login.config is set. In the logs, the Storm principal it connects to is HTTPS/hostname@realm by default, whereas it is set to HTTP/hostname@realm in Ambari.
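For anyone hitting the same mismatch: the client logs in through the JAAS file named by -Djava.security.auth.login.config, and the HTTPS/ service name in the trace below appears to come from the JDK's SPNEGO client deriving the service principal from the URL scheme (https becomes HTTPS/hostname), which then does not match the HTTP/hostname@REALM principal registered in the KDC. A hypothetical StormClient JAAS entry for reference; the keytab path, principal, and realm are placeholders, not values from this cluster:

```
StormClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/security/keytabs/user.keytab"
    storeKey=true
    useTicketCache=false
    principal="user@REALM";
};
```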
10-17-2017
08:39 PM
Getting the below error when I try to connect to Storm. I am using HDP 2.6, and my Storm services are SSL- and Kerberos-enabled. storm_ui_principal_name is set to its default value, HTTP/_HOST@REALM.

Found ticket for user@REALM to go to krbtgt/REALM@REALM expiring on Tue Oct 17 17:13:47 IST 2017
Entered Krb5Context.initSecContext with state=STATE_NEW
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
Using builtin default etypes for default_tgs_enctypes
default etypes for default_tgs_enctypes: 18 17 16 23.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.Aes256CtsHmacSha1EType
>>> KrbKdcReq send: kdc=hostname UDP:88, timeout=30000, number of retries =3, #bytes=631
>>> KDCCommunication: kdc=hostname UDP:88, timeout=30000, Attempt =1, #bytes=631
>>> KrbKdcReq send: #bytes read=176
>>> KdcAccessibility: remove hostname
>>> KDCRep: init() encoding tag is 126 req type is 13
>>> KRBError:
    cTime is Fri Oct 05 05:08:11 IST 2007 1191541091000
    sTime is Mon Oct 16 17:13:47 IST 2017 1508154227000
    suSec is 841840
    error code is 7
    error Message is Server not found in Kerberos database
    cname is user@REALM
    sname is HTTPS/hostname@realm
    msgType is 30
KrbException: Server not found in Kerberos database (7) - LOOKING_UP_SERVER
    at sun.security.krb5.KrbTgsRep.<init>(KrbTgsRep.java:73)
    at sun.security.krb5.KrbTgsReq.getReply(KrbTgsReq.java:259)
    at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:270)
    at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:302)
    at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:120)
    at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:458)
    at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:693)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at sun.security.jgss.spnego.SpNegoContext.GSS_initSecContext(SpNegoContext.java:882)
    at sun.security.jgss.spnego.SpNegoContext.initSecContext(SpNegoContext.java:317)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)