Member since: 04-04-2017
Posts: 5
Kudos Received: 0
Solutions: 0
11-30-2017 06:58 PM
We are using NiFi to read/write data from Kafka and push that data to HBase. Kafka and HBase are on two different clusters with two different Kerberos servers, and we are facing an issue connecting to both systems at the same time. The default realm defined in krb5.conf gets preference, so we are only able to connect to the system in that default realm. We are giving the full usernames including the Kerberos realms, but that is not helping. Here is the detailed issue:

Kafka realm - ABC.COM
HBase realm - XYZ.COM
default_realm specified in krb5.conf - XYZ.COM

When trying to use PublishKafka with the username kafkauser@ABC.COM, we are getting an error. On the Kafka server, we see an authentication failure saying that the user kafkauser@XYZ.COM (rather than kafkauser@ABC.COM) is not found. If we change the default realm to ABC.COM, PublishKafka works fine without any issues, but then HBase runs into the same problem. If we remove the default realm altogether, none of the processors work. Is there any way to get around this default_realm issue?
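For reference, a krb5.conf that declares both realms side by side looks roughly like the sketch below. The realm names are the ones above; the KDC hostnames are placeholders, not our actual servers. The question is whether a setup like this can avoid the default_realm dependency at all:

[libdefaults]
  default_realm = XYZ.COM

[realms]
  ABC.COM = {
    kdc = kdc.abc.com
  }
  XYZ.COM = {
    kdc = kdc.xyz.com
  }

[domain_realm]
  .abc.com = ABC.COM
  .xyz.com = XYZ.COM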
04-19-2017 01:27 AM
Hi @wynner, it was indeed an issue with an incorrect KDC server specified in the jaas.conf file. Changing that fixed the error. Thanks for looking into this.
04-18-2017 12:57 PM
@wynner We use three different users: one user (ser_nifi) to start and run the NiFi service, the "nifi" user to connect to HDFS, and the "kafka" user to connect to Kafka. We were able to do a kinit with the "kafka" user without any issue. In this same environment we were able to connect to HBase on the same server and write data into a table; only the Kafka processor is throwing this error.
04-17-2017 09:41 PM
Hello all, we are having issues connecting from NiFi (1.1) to a Kafka (0.9) server using the ConsumeKafka processor on a Kerberos-enabled cluster. We were able to execute the same process in one environment (dev), but are getting the error below in the test environment:

"Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. Make sure -Djava.security.auth.login.config property passed to JVM and the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)'. Make sure you are using FQDN of the Kafka broker you are trying to connect to. not available to garner authentication information from the user"

We have updated the bootstrap.conf file to point at the correct jaas.conf file:

java.arg.15=-Djava.security.auth.login.config=/data/configuration_resources/jaas.conf

We have updated the jaas.conf file to use the correct principal and keytab:

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/data/configuration_resources/kafka.keytab"
principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
};

We have updated the client.properties file:

security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka

And we have added the krb5.conf location in the nifi.properties file:

nifi.kerberos.krb5.file=/data/configuration_resources/krb5.conf

What other files or configurations do we have to check to resolve this issue?
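As a sanity check outside NiFi (not something covered above; the keytab path and principal are simply the ones from our jaas.conf), the keytab can be verified with the standard MIT Kerberos tools, run as the user that launches NiFi:

kinit -kt /data/configuration_resources/kafka.keytab kafka/kafka1.hostname.com@EXAMPLE.COM
klist

If kinit succeeds and klist shows a ticket for that principal, the keytab itself is fine and the problem is more likely in how NiFi is picking up the JAAS configuration.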
04-04-2017 05:37 PM
I have an incoming JSON message coming into NiFi, and one of its values is an XML clob. I can get the attribute out and parse the XML using the XMLTransform processor, but how can I merge that data back into the original JSON? I tried using the merge processor, but have the following concerns: 1) the merge processor is not able to combine the two JSON files into one JSON document; 2) when multiple source messages are hitting NiFi, how can NiFi determine which flowfiles should be merged together? A sketch of the end result I am after follows below.
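Conceptually, the result I want is shown by the standalone Python sketch below. The field names "id" and "payload" are made up for illustration and are not the real message structure; the point is that the XML clob is parsed and replaced in place inside the original JSON, rather than merged back from a second flowfile.

import json
import xml.etree.ElementTree as ET

# Made-up example message: one JSON field carries an XML clob as a string
original = json.loads('{"id": 1, "payload": "<order><item>widget</item></order>"}')

# Parse the XML clob that arrived as a string value
root = ET.fromstring(original["payload"])

# Replace the clob with a structured version of the same data
original["payload"] = {child.tag: child.text for child in root}

print(json.dumps(original))
# {"id": 1, "payload": {"item": "widget"}}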