Kafka kerberization

Hi everybody

We are currently using HDP and we want to set up a new cluster running only Kafka, based on HDP 2.4.2. This cluster will be independent from our Hadoop one and will be used for both Hadoop and ELK (and maybe more later... that's the reason why we want it to be isolated).

This Kafka cluster needs to be secured with Kerberos... But during this setup with Ambari, the installer wants to create a principal for HDFS, which makes no sense since we do not have HDFS on this cluster. The installer fails because of a missing parameter, ${hadoop-env/hdfs_user}. Is this a known bug that will be fixed in a future release? Can we do this without installing HDFS first (which works, I tested it...) and then trying to uninstall it cleanly?

As a workaround, can I define this variable ${hadoop-env/hdfs_user} within Ambari or somewhere else to let the installer go further? I don't mind ending up with an additional, unused principal in Kerberos. Regards
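
PS: to make the workaround idea concrete, here is a rough, untested sketch of what I mean by defining the variable myself, using the Ambari REST API from Python. The Ambari host, credentials and config tag are placeholders, and I don't know yet whether Ambari accepts a hadoop-env config type on a cluster where HDFS is not installed:

import json
import requests

# Placeholders: adjust to your Ambari server, cluster name and credentials.
AMBARI = "http://ambari-server.localdomain:8080"
CLUSTER = "hdp24"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}

# Register a minimal hadoop-env config version that only defines hdfs_user,
# so the Kerberos wizard can resolve ${hadoop-env/hdfs_user}.
payload = {
    "Clusters": {
        "desired_config": {
            "type": "hadoop-env",
            "tag": "kerberos-workaround-1",
            "properties": {"hdfs_user": "hdfs"},
        }
    }
}

resp = requests.put(
    AMBARI + "/api/v1/clusters/" + CLUSTER,
    auth=AUTH,
    headers=HEADERS,
    data=json.dumps(payload),
)
resp.raise_for_status()
print("hadoop-env/hdfs_user is now defined; re-run the Kerberos wizard.")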

2 REPLIES

Re: Kafka kerberization


I have not tried this yet, but I need to do the same thing. Are you having this issue with blueprints, or even when using the web UI?

I am running into a similar issue: Ambari seems to expect every cluster to have HDFS / a NameNode.

In my case I can create the Kafka cluster via the web UI, but I get an error about a missing NameNode when trying to deploy with Ambari blueprints:

https://community.hortonworks.com/questions/37342/using-blueprints-for-cluster-without-a-namenode.ht...
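
For reference, this is roughly the blueprint I am trying to register, as a minimal sketch (the Ambari host, credentials, blueprint name and host group layout are placeholders); creating a cluster from it is where I get the missing NameNode error:

import json
import requests

# Placeholders: Ambari server, credentials and blueprint name.
AMBARI = "http://ambari-server.localdomain:8080"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}

# A Kafka-only blueprint: just ZooKeeper and Kafka brokers, no HDFS.
blueprint = {
    "Blueprints": {"stack_name": "HDP", "stack_version": "2.4"},
    "host_groups": [
        {
            "name": "kafka_hosts",
            "cardinality": "3",
            "components": [
                {"name": "ZOOKEEPER_SERVER"},
                {"name": "KAFKA_BROKER"},
            ],
        }
    ],
}

resp = requests.post(
    AMBARI + "/api/v1/blueprints/kafka-only",
    auth=AUTH,
    headers=HEADERS,
    data=json.dumps(blueprint),
)
print(resp.status_code, resp.text)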

Bert

Re: Kafka kerberization

Hi Bert

There is no problem provisioning a Kafka cluster from Ambari. I did it from the web UI (I haven't tested it with a blueprint yet); my cluster is just made of Kafka + ZooKeeper (pulled in as a dependency by Ambari). I also added Ambari Metrics... and I don't have HDFS. So far it works fine... The only thing I noticed is that if you also want Ranger for Kafka, there is a dependency on HDFS (but I did not install Ranger). My post is about adding Kerberos... Ambari wants to create this principal:

hdp24.localdomain,/HDFS/NAMENODE/hdfs,${hadoop-env/hdfs_user}-hdp24@LOCALDOMAIN,USER,${hadoop-env/hdfs_user},/etc/security/keytabs/hdfs.headless.keytab,${hadoop-env/hdfs_user},r,hadoop,r,440,unknown

but because I don't have HDFS, the variable ${hadoop-env/hdfs_user} is not set and the installer fails. If I find a way to set this variable it should work, even though I don't care about this Kerberos principal...
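
By the way, this identity seems to come from the stack-level Kerberos descriptor rather than from anything I configured. If you want to see where it is defined, a rough sketch (assuming the standard stack artifact endpoint, with placeholder Ambari host and credentials) would be something like:

import requests

# Placeholders: Ambari server and credentials.
AMBARI = "http://ambari-server.localdomain:8080"
AUTH = ("admin", "admin")

# Fetch the stack's default Kerberos descriptor, which lists every
# principal/keytab the wizard will try to create.
resp = requests.get(
    AMBARI + "/api/v1/stacks/HDP/versions/2.4/artifacts/kerberos_descriptor",
    auth=AUTH,
)
resp.raise_for_status()
body = resp.json()
descriptor = body.get("artifact_data", body)

def walk(node, path=""):
    # Print every identity whose principal references hdfs_user.
    for identity in node.get("identities", []):
        principal = (identity.get("principal") or {}).get("value") or ""
        if "hdfs_user" in principal:
            print(path or "/", "->", principal)
    for key in ("services", "components"):
        for child in node.get(key, []):
            walk(child, path + "/" + child.get("name", "?"))

walk(descriptor)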

Keep in touch if you want.

Regards