Integrate Kerberos with Kafka in an HBase coprocessor

New Contributor

Hi,

 

I want to integrate Kerberos with Kafka in an HBase coprocessor, but I cannot authenticate inside the application. I am getting this error:

 

Caused by: java.lang.IllegalArgumentException: You must pass java.security.auth.login.config in secure mode.
at org.apache.kafka.common.security.kerberos.Login.login(Login.java:289)
at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44)
at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55)
... 14 more

 

 

This is part of my coprocessor postPut code:

 

 

// Local Spark context with Kerberos principal/keytab settings for YARN
SparkConf conf = new SparkConf().setAppName("Coprocessor").setMaster("local[1]");
JavaSparkContext sc = new JavaSparkContext(conf);
sc.getConf().set("spark.yarn.principal", "user@EXAMPLE.COM");
sc.getConf().set("spark.yarn.keytab", "/home/user/user.keytab");
sc.getConf().set("spark.yarn.credentials.file", "credential_file");

// Kafka producer configured for SASL_PLAINTEXT (Kerberos); no JAAS login configuration is provided here
Properties props = new Properties();
props.put("bootstrap.servers", "server.com:9092");
props.put("client.id", "client-id-coprocessor");
props.put("key.serializer", StringSerializer.class.getName());
props.put("value.serializer", StringSerializer.class.getName());
props.put("security.protocol", "SASL_PLAINTEXT");
props.put("sasl.kerberos.service.name", "kafka");

KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
ProducerRecord<String, String> message = new ProducerRecord<String, String>(KAFKA_TOPIC, "key", "this is a simple message");
producer.send(message);
producer.close();


5 REPLIES

Expert Contributor

Regarding how to make Spark work with Kerberos-enabled Kafka, please refer to the Cloudera engineering blog:

 

https://blog.cloudera.com/blog/2017/05/reading-data-securely-from-apache-kafka-to-apache-spark/

 

It explains the prerequisites and the solution, and includes sample code.
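
Independent of Spark, the error in the original post means the Kafka client's JVM cannot find a JAAS login configuration. A minimal client-side sketch of that general fix, assuming a jaas.conf with a KafkaClient section already exists (the file path and broker address below are placeholders, not taken from the blog):

// Point the JVM at a JAAS file containing a KafkaClient section before the
// first Kafka login happens; passing -Djava.security.auth.login.config=...
// on the JVM command line achieves the same thing and is usually more reliable.
System.setProperty("java.security.auth.login.config", "/etc/kafka/jaas.conf");

Properties props = new Properties();
props.put("bootstrap.servers", "server.com:9092");
props.put("key.serializer", StringSerializer.class.getName());
props.put("value.serializer", StringSerializer.class.getName());
props.put("security.protocol", "SASL_PLAINTEXT");
props.put("sasl.kerberos.service.name", "kafka");
KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);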

Explorer
Thanks, it helped me solve my problem!

New Contributor

Hi Flore,

 

We are blocked by a coprocessor issue in a Kerberos environment. It would be great if you could explain in a bit more detail the steps you followed to run a coprocessor in a Kerberos environment.

 

Below are a few questions:

  a) Which keytab did you use: a CM-generated keytab or a user keytab generated by you?
  b) What are the paths of your jaas.conf and keytab for Kafka?
  c) How are the Kafka Kerberos configuration parameters set?

 

I am able to execute my coprocessor code in a non-Kerberos cluster, but I get the error "org.apache.kafka.common.KafkaException: Jaas configuration not found" when running the code inside the coprocessor in a Kerberos environment.

 

Thanks in advance.

Regards

Sumanta

Contributor

We got this working by pointing to the HBase keytab and ensuring that the jaas.conf exists on each master/region server.

My coprocessor now produces messages to a secure Kafka topic.

 

Of course, you need to have the master and region servers pointing to the jaas.conf file,

 

i.e., in the master and region server Java configuration:

-Djava.security.auth.login.config=/etc/hbase/jaas.conf
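
For anyone following along, a minimal sketch of what such a jaas.conf could contain for the Kafka client; the keytab path and principal below are placeholders, not taken from this post:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/hbase.service.keytab"
  principal="hbase/host.example.com@EXAMPLE.COM";
};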

 

New Contributor
Hi Suku:


I'll answer some of your questions:

a) Which keytab did you use: a CM-generated keytab or a user keytab generated by you?

I used kafka.keytab.

b) What are the paths of your jaas.conf and keytab for Kafka?

The kafka.keytab is in /etc/security/keytabs/.

c) How are the Kafka Kerberos configuration parameters set?

The following is the configuration of the Kafka parameters and the way to set the JAAS parameter:


Properties props = new Properties();
props.put("bootstrap.servers", "xxxx:9092,xxx:9092");
props.put("client.id", "client-id-coprocessor");
props.put("key.serializer", StringSerializer.class.getName());
props.put("value.serializer", StringSerializer.class.getName());
props.put("security.protocol", "SASL_PLAINTEXT");
props.put("sasl.kerberos.service.name", "kafka");
// Inline JAAS configuration via sasl.jaas.config, so no external jaas.conf file is needed
props.put("sasl.jaas.config",
        "com.sun.security.auth.module.Krb5LoginModule required \n" +
        "useKeyTab=true \n" +
        "storeKey=true \n" +
        "keyTab=\"/etc/security/keytabs/kafka.keytab\" \n" +
        "principal=\"kafka/nodo@REALM\";");
KafkaProducer<String, String> producer = new KafkaProducer<String, String>(props);
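
For completeness, a minimal usage sketch with this producer; the topic name is a placeholder, and the imports are the standard kafka-clients classes used above:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Send one record to a placeholder topic, then release the producer's resources.
ProducerRecord<String, String> record =
        new ProducerRecord<String, String>("test-topic", "key", "message from the coprocessor");
producer.send(record);
producer.close();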


Remember that sometimes you will need to restart your HBase service to deploy your coprocessor.


I hope this helps you.


Florentino