Created 09-18-2018 07:03 AM
Hi All,
We are trying to push data into a kerberized Kafka cluster from an external, non-kerberized NiFi node.
The NiFi node's /etc/krb5.conf was previously configured with the ABC.COM realm and everything was working fine.
Now that we have changed the realm in the same file to XYZ.COM, we are getting the following error.
Please let me know if any service needs to be restarted when using a kerberized cluster from a non-kerberized node.
2018-09-17 17:24:58,905 ERROR [Timer-Driven Process Thread-144] o.a.n.p.kafka.pubsub.PublishKafka_0_10 PublishKafka_0_10[id=e01735cf-1a47-12b1-8151-82518eab4545] PublishKafka_0_10[id=e01735cf-1a47-12b1-8151-82518eab4545] failed to process session due to org.apache.kafka.common.KafkaException: Failed to construct kafka producer: {}
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:342)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:166)
    at org.apache.nifi.processors.kafka.pubsub.PublisherPool.createLease(PublisherPool.java:61)
    at org.apache.nifi.processors.kafka.pubsub.PublisherPool.obtainPublisher(PublisherPool.java:56)
    at org.apache.nifi.processors.kafka.pubsub.PublishKafka_0_10.onTrigger(PublishKafka_0_10.java:312)
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1122)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:128)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Cannot locate KDC
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:94)
    at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:93)
    at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:51)
    at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:84)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:305)
    ... 16 common frames omitted
Caused by: javax.security.auth.login.LoginException: Cannot locate KDC
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:804)
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
    at sun.reflect.GeneratedMethodAccessor1592.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
    at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:58)
    at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:109)
    at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:55)
    at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:83)
    at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
    ... 20 common frames omitted
Caused by: sun.security.krb5.KrbException: Cannot locate KDC
    at sun.security.krb5.Config.getKDCList(Config.java:1084)
    at sun.security.krb5.KdcComm.send(KdcComm.java:218)
    at sun.security.krb5.KdcComm.send(KdcComm.java:200)
    at sun.security.krb5.KrbAsReqBuilder.send(KrbAsReqBuilder.java:316)
    at sun.security.krb5.KrbAsReqBuilder.action(KrbAsReqBuilder.java:361)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:776)
    ... 36 common frames omitted
Caused by: sun.security.krb5.KrbException: Generic error (description in e-text) (60) - Unable to locate KDC for realm XYZ.COM
    at sun.security.krb5.Config.getKDCFromDNS(Config.java:1181)
    at sun.security.krb5.Config.getKDCList(Config.java:1057)
    ... 41 common frames omitted
Created 09-18-2018 08:28 PM
That's exactly how it is designed to function. When you create a Kerberos database (kdb5_util create -s) and generate keytabs, those keytabs are tied to that specific database. Think of the keytab as a biometric passport and the KDC as the passport database: the passport you present at the airport is checked against that database to confirm it's really you. That is exactly what is happening here. Your keytabs were issued by the KDC for ABC.COM, yet you are presenting them against XYZ.COM, which is like presenting the wrong passport. Kafka is not going to authenticate until the keytabs you use match the realm and KDC they were generated in.
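If you want to confirm which realm the keytab was actually issued for, a quick check from the NiFi node could look like this (the keytab path is a placeholder, adjust it to wherever username.service.keytab lives):

# List the principals stored in the keytab; the realm suffix shows which KDC database issued them
klist -kt /path/to/username.service.keytab

# Try to obtain a ticket with the keytab; this only succeeds against the realm/KDC it was generated in
kinit -kt /path/to/username.service.keytab username/hostname@XYZ.COM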
HTH
Created 09-18-2018 07:56 AM
Hi @Rohit Sharma,
The "krb5.conf" file shouldn't be changed depending on the REALM, you can configure more than one REALM inside.
https://web.mit.edu/kerberos/krb5-devel/doc/admin/realm_config.html
Check if the keytab is pointing to the correct REALM.
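For example, a minimal sketch of a krb5.conf that keeps both realms defined side by side (the KDC host names below are placeholders, not your actual values):

[libdefaults]
  default_realm = XYZ.COM

[realms]
  ABC.COM = {
    kdc = kdc.abc.example
    admin_server = kdc.abc.example
  }
  XYZ.COM = {
    kdc = kdc.xyz.example
    admin_server = kdc.xyz.example
  }

[domain_realm]
  .abc.example = ABC.COM
  .xyz.example = XYZ.COM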
Hope it helps!
Gonçalo
Created 09-18-2018 08:00 AM
Also, the log shows "Unable to locate KDC for realm XYZ.COM". Make sure that realm actually exists and that the mappings for it inside the file are correct.
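If you want to see exactly which KDC the Kerberos libraries are trying to contact, one option (assuming MIT Kerberos is installed on the NiFi node; the principal below is just an example) is:

# Trace the KDC lookup and connection attempts made by the OS Kerberos libraries
KRB5_TRACE=/dev/stderr kinit username/hostname@XYZ.COM

On the JVM side, NiFi's bootstrap.conf can carry an extra java.arg such as -Dsun.security.krb5.debug=true to print similar detail from the Java Kerberos code into the NiFi logs.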
Created 09-18-2018 10:45 AM
@Gonçalo Cunha,
Thanks for your response.
The current krb5.conf looks like:
[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = XYZ.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5

[domain_realm]
  .example.com = XYZ.COM
  example.com = XYZ.COM

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log

[realms]
  XYZ.COM = {
    admin_server = FQDN
    kdc = FQDN
  }
The principal used against this realm looks like:
username/hostname@XYZ.COM
Keytab:
username.service.keytab
Created 09-18-2018 01:12 PM
The krb5.conf file seems to be OK for the XYZ.COM realm; however, you will not be able to authenticate or validate principals for the ABC.COM realm. I hope that this is what was desired.
Also, I assume that "FQDN" in the file is masked and is actually a FQDN that points to the KDC for the realm of XYZ.COM.
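If there is any doubt, it may be worth verifying from the NiFi node that this FQDN resolves and that the Kerberos port on it is reachable (host name below is a placeholder):

# Confirm the KDC host name resolves from the NiFi node
nslookup kdc.xyz.example

# Confirm the standard Kerberos port (88) on the KDC is reachable
nc -zv kdc.xyz.example 88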
How was the change to the realm name in the krb5.conf file performed? Did you manually edit the krb5.conf file itself, or edit the krb5.conf template in Ambari? In either case, I do not think Ambari allows the realm name to simply be changed on a cluster where Kerberos is enabled. It might be workable if you were manually managing the Kerberos identities, but I would suggest disabling Kerberos and then re-enabling it via Ambari using the new KDC details.
At the very least, you should restart all of the services to ensure they pick up any principal changes or Kerberos infrastructure changes - like the new krb5.conf file.