
Error in NiFi Flow when using ConsumeKafka_0_10

Rising Star

Team,

I am getting the error below when trying to consume Kafka messages from a kerberized HDP cluster (where the Kafka clients are installed) from a non-kerberized HDF (NiFi) instance.

failed to process session due to org.apache.kafka.common.KafkaException: Failed to construct kafka consumer: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer

Please let me know if there are any steps that I need to follow.

Regards

Bharadwaj

1 ACCEPTED SOLUTION

Master Guru

@Bharadwaj Bhimavarapu

Whether you are using the ConsumeKafka or PublishKafka processors, if Kafka is kerberized you will need to set up a JAAS file for your NiFi installation that provides the keytab and principal used to establish that secured connection.

By default /etc/krb5.conf will be used, but you can also tell NiFi to use a different krb5.conf file via a property in nifi.properties (nifi.kerberos.krb5.file=).
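For example, if you copy the krb5.conf from your kerberized HDP cluster onto the NiFi host, the nifi.properties entry might look like this (the path is just a placeholder for wherever you place the file):

nifi.kerberos.krb5.file=/<path-to>/krb5.conf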

You will need to create a JAAS file (for example, kafka-jaas.conf) that contains the following (update it with the appropriate keytab and principal for your user):

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="nifi.keytab"
  serviceName="kafka"
  principal="nifi@DOMAIN";
};

Add the following line to NiFi's bootstrap.conf file (make sure arg number 20 is not already in use; if it is, change it to an unused number):

java.arg.20=-Djava.security.auth.login.config=/<path-to>/kafka-jaas.conf

Update the following configuration properties in your ConsumeKafka processor:

SecurityProtocol = SASL_PLAINTEXT
ServiceName = kafka
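For reference, those processor settings correspond to the standard Kafka 0.10 consumer properties, so if you ever want to sanity-check the broker connection outside of NiFi with a plain Kafka client, the equivalent client configuration would be roughly:

security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka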

Basically, you are setting up the Kafka client Kerberos environment for your NiFi JVM.
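A quick way to sanity-check the keytab and principal before involving NiFi at all is to authenticate with them manually from the NiFi host (assuming the Kerberos client tools are installed; adjust the path and principal to match your JAAS file):

kinit -kt /<path-to>/nifi.keytab nifi@DOMAIN
klist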

If this is a NiFi cluster, you will need to do the above on every node.

You will need to restart NiFi for these changes to take effect.

Thanks,

Matt


4 REPLIES

Rising Star
@Matt Clarke

Thanks Matt, it worked.

@Matt Clarke, should the nifi.kerberos.krb5.file= property point to /<path-to>/kafka-jaas.conf?

@Matt, even after I configured everything as per the instructions, I am still getting the same error...
