Support Questions

Find answers, ask questions, and share your expertise

Error in NiFi Flow when using ConsumeKafka_0_10

Expert Contributor

Team,

I am getting the below error when I try to consume Kafka messages from a kerberized HDP cluster (where the Kafka clients are installed) from a non-kerberized HDF (NiFi).

failed to process session due to org.apache.kafka.common.KafkaException: Failed to construct kafka consumer: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer

Please let me know if there are any steps that I need to follow.

Regards

Bharadwaj

1 ACCEPTED SOLUTION

Master Mentor

@Bharadwaj Bhimavarapu

Whether you are using the ConsumeKafka or PublishKafka processors, if Kafka is kerberized you will need to set up a JAAS file in your NiFi that provides the keytab and principal used to establish the secured connection.

By default /etc/krb5.conf will be used, but you can also tell NiFi to use a different krb5.conf file via a property in nifi.properties (nifi.kerberos.krb5.file=).
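For example, if your krb5.conf lives somewhere other than the default, the nifi.properties entry might look like this (path shown is illustrative):

```properties
# conf/nifi.properties -- only needed if krb5.conf is not at the default location
nifi.kerberos.krb5.file=/etc/krb5.conf
```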

You will need to create a JAAS file (example: kafka-jaas.conf) containing the following (update it to use the appropriate keytab and principal for your user):

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="nifi.keytab"
  serviceName="kafka"
  principal="nifi@DOMAIN";
};

Add the following line to NiFi's bootstrap.conf file (make sure arg number 20 is not already in use; if it is, change it to an unused number):

java.arg.20=-Djava.security.auth.login.config=/<path-to>/kafka-jaas.conf
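For context, the new entry just needs a unique argument number alongside the existing java.arg.* lines in bootstrap.conf. A sketch (the neighboring lines and their values are illustrative defaults, and the JAAS path is a placeholder you must fill in):

```properties
# existing entries in conf/bootstrap.conf (illustrative)
java.arg.2=-Xms512m
java.arg.3=-Xmx512m
# new entry pointing the NiFi JVM at the JAAS file
java.arg.20=-Djava.security.auth.login.config=/<path-to>/kafka-jaas.conf
```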

Update the following configuration properties in your ConsumeKafka processor:

Security Protocol = SASL_PLAINTEXT
Service Name = kafka
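These processor properties map onto the underlying Kafka client configuration. Roughly, what NiFi ends up passing to the consumer corresponds to the following client properties (a sketch; names are from the standard Kafka 0.10 consumer configuration):

```properties
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
```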

Essentially, you are setting up the Kafka client Kerberos environment for your NiFi JVM.

If this is a NiFi cluster, you will need to do the above on every node.

You will need to restart NiFi for these changes to take effect.

Thanks,

Matt


4 REPLIES 4


Expert Contributor
@Matt Clarke

Thanks Matt, it worked.

Rising Star

@Matt Clarke, should nifi.kerberos.krb5.file= point to /<path-to>/kafka-jaas.conf?

Rising Star

@Matt, even after I configured everything as per the instructions, I am still getting the same error.