Created 12-14-2015 10:54 AM
Specifications:
What would be a working example of a Spark streaming job that reads input from Kafka in a secure cluster under above conditions?
We made a Spark streaming job work that reads/writes into secure HBase and thought it couldn't be that different to do it with Kafka.
Created 12-15-2015 07:13 PM
Spark Streaming has not yet enabled security for its Kafka connector.
Created 10-06-2016 03:18 AM
What is the issue you are facing after following the above documentation in HDP 2.4.2?
Created 04-20-2016 11:40 PM
If you want to test it, even on HDP 2.3.x (Spark 1.5.2), you can use the following library, which has been compiled with the necessary changes:
Created 04-28-2016 07:18 AM
Do you have the code with the changes that you did in that jar?
Created 04-21-2016 07:38 AM
We are using HDP-2.3.4.0 with Kafka and Spark Streaming (Scala & Python) on a Kerberos + Ranger secured cluster.
You need to add a JAAS config location to the spark-submit command. We are using it in yarn-client mode. The kafka_client_jaas.conf file is sent as a resource with the --files option and is available in the YARN container. We did not get ticket renewal working yet...
spark-submit (all your stuff) \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=kafka_client_jaas.conf" \
  --files "your_other_files,kafka_client_jaas.conf,serviceaccount.headless.keytab" \
  (rest of your stuff)

# --principal and --keytab do not work here and conflict with the --files keytab.
# The jaas file will be placed in the YARN containers by Spark.
# The file contains the reference to the keytab file and the principal for Kafka and ZooKeeper:

KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=false
   useKeyTab=true
   principal="serviceaccount@DOMAIN.COM"
   keyTab="serviceaccount.headless.keytab"
   renewTicket=true
   storeKey=true
   serviceName="kafka";
};

Client {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=false
   useKeyTab=true
   principal="serviceaccount@DOMAIN.COM"
   keyTab="serviceaccount.headless.keytab"
   renewTicket=true
   storeKey=true
   serviceName="zookeeper";
};
If you need more info, feel free to ask.
Greetings,
Alexander
Created 04-28-2016 07:20 AM
I have tried this approach and it doesn't work 😞 . Spark doesn't throw any error, but it doesn't read anything from the kerberized Kafka.
Created 04-28-2016 01:32 PM
After a hard day our team solved the issue: we created a small Spark application that can read from a kerberized Kafka, launched in yarn-cluster mode. Attached are the pom.xml you need (very important: use the Kafka 0.9 dependency) and the Java code.
It is very important to use the old Kafka API and the value PLAINTEXTSASL (not SASL_PLAINTEXT, which comes from the new Kafka API) when configuring the Kafka stream.
The jaas.conf file is the same one that @Alexander Bij shared. Be careful: the keytab path is not a local path, it is the path where the executor stores the file (by default you only have to indicate the name of the keytab).
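To make this concrete, here is a minimal sketch of the consumer parameters the post implies. The ZooKeeper quorum, group id, and the commented-out stream creation are placeholder assumptions; the commented call shows where the parameters plug into the old Spark 1.x receiver API (KafkaUtils.createStream).

```java
import java.util.HashMap;
import java.util.Map;

public class SecureKafkaParams {

    // Consumer parameters for the old (Kafka 0.8-style) high-level consumer.
    // Note the old-API value PLAINTEXTSASL -- SASL_PLAINTEXT is the new-API
    // name and will not work here, as explained above.
    static Map<String, String> buildKafkaParams() {
        Map<String, String> kafkaParams = new HashMap<String, String>();
        kafkaParams.put("zookeeper.connect", "zkhost1:2181");  // placeholder ZK quorum
        kafkaParams.put("group.id", "secure-test-group");      // placeholder consumer group
        kafkaParams.put("security.protocol", "PLAINTEXTSASL"); // old-API SASL value
        return kafkaParams;
    }

    public static void main(String[] args) {
        System.out.println(buildKafkaParams().get("security.protocol"));
        // In the Spark 1.5.x job these params feed the old receiver API, e.g.:
        // JavaPairReceiverInputDStream<String, String> stream =
        //     KafkaUtils.createStream(jssc,
        //         String.class, String.class,
        //         StringDecoder.class, StringDecoder.class,
        //         buildKafkaParams(), topicMap,
        //         StorageLevel.MEMORY_AND_DISK_SER());
    }
}
```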
KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=false
   useKeyTab=true
   principal="serviceaccount@DOMAIN.COM"
   keyTab="serviceaccount.headless.keytab"
   renewTicket=true
   storeKey=true
   serviceName="kafka";
};

Client {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=false
   useKeyTab=true
   principal="serviceaccount@DOMAIN.COM"
   keyTab="serviceaccount.headless.keytab"
   renewTicket=true
   storeKey=true
   serviceName="zookeeper";
};
Kafka will use this information to retrieve the Kerberos configuration required when connecting to ZooKeeper and to the broker. To do that, the executor needs the keytab file to obtain a Kerberos ticket before connecting to the broker; that is why the keytab is also shipped to the container when launching the application. Notice that spark.executor.extraJavaOptions refers to the container-local path (the file is placed in the container's working directory), while on the machine that launches the application you must reference the local path where the jaas file actually lives.
spark-submit (all your stuff) \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=kafka_client_jaas.conf" \
  --files "your_other_files,kafka_client_jaas.conf,serviceaccount.headless.keytab" \
  (rest of your stuff)
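In yarn-client mode the driver runs on the launching machine, so, following the note above about local paths (the concrete paths below are placeholder assumptions), you would additionally point the driver at the local copy of the jaas file:

```shell
spark-submit (all your stuff) \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=/local/path/kafka_client_jaas.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=kafka_client_jaas.conf" \
  --files "kafka_client_jaas.conf,serviceaccount.headless.keytab" \
  (rest of your stuff)
```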
Also verify that the parameter where you indicate the jaas file is java.security.auth.login.config.
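A quick way to sanity-check this is to read the system property back from inside the JVM; the small standalone sketch below simulates what the extraJavaOptions flag sets (the file name is whatever you shipped with --files):

```java
public class JaasPropertyCheck {
    public static void main(String[] args) {
        // spark.executor.extraJavaOptions=-Djava.security.auth.login.config=...
        // sets this JVM system property; simulate it here:
        System.setProperty("java.security.auth.login.config", "kafka_client_jaas.conf");

        // The name must be exactly "java.security.auth.login.config"; a typo
        // such as "...login.conf" is silently ignored by the JVM.
        String jaas = System.getProperty("java.security.auth.login.config");
        System.out.println("JAAS config in use: " + jaas);
    }
}
```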
And that's all: following the above instructions and using the attached code, you will be able to read from a kerberized Kafka using Spark 1.5.2.
In our Kerberized environment we have Ambari 2.2.0.0, HDP 2.3.4.0 and Spark 1.5.2. I hope this information helps you.
Created 12-23-2016 09:09 PM
Hi Javier,
Can you please upload the running code?
Inside pom.xml it gives the compilation error below:
Project build error: Non-resolvable parent POM for com.accenture.ngap:spark-test:[unknown-version]: Failure to transfer com.accenture.ngap:parent:pom:0.0.1 from http://repo.hortonworks.com/content/repositories/releases/ was cached in the local repository, resolution will not be reattempted until the update interval of hortonworks has elapsed or updates are forced. Original error: Could not transfer artifact com.accenture.ngap:parent:pom:0.0.1 from/to hortonworks (http://repo.hortonworks.com/content/repositories/releases/): connect timed out and 'parent.relativePath' points at wrong local POM
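This error means Maven cannot resolve the <parent> POM (com.accenture.ngap:parent:pom:0.0.1), which likely only existed in the original author's internal build. A common workaround (an assumption, not part of the original post; the coordinates below are placeholders) is to drop the <parent> block and declare standalone coordinates plus the dependencies directly:

```xml
<!-- Replace the unresolvable <parent> block with standalone coordinates.
     Version shown matches the Spark 1.5.2 stack discussed in this thread. -->
<groupId>com.example</groupId>
<artifactId>spark-kafka-secure-test</artifactId>
<version>0.0.1-SNAPSHOT</version>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.5.2</version>
  </dependency>
</dependencies>
```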
Created 05-13-2016 09:36 PM
Are there examples for HDP 2.3.2 - Spark Streaming (1.4.1) and Kafka (0.8.2) with Java in a Kerberos environment?
Created 12-16-2016 11:54 AM
Did you try the code that I shared at https://community.hortonworks.com/comments/30193/view.html, a few comments above?
Created 08-18-2016 09:34 AM
Hi Team,
I am currently using Spark Streaming with Kafka without SSL, and it works fine.
But we need to enable SSL for Spark Streaming with Kafka.
Is this supported ?
I see that Spark 2.0 also supports Kafka 0.8, but SSL support was added in Kafka 0.9.
Is there any way in which HDP provides SSL-enabled Spark Streaming with Kafka?
If yes, can anyone share a document link or any other info?
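For context on what this would involve: since Kafka 0.9, the clients can be configured for SSL with the standard client properties along these lines (all store locations and passwords below are placeholders):

```properties
# Kafka >= 0.9 client-side SSL settings
security.protocol=SSL
ssl.truststore.location=/etc/security/kafka.client.truststore.jks
ssl.truststore.password=changeit
# Needed only when the broker requires client authentication:
ssl.keystore.location=/etc/security/kafka.client.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```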
Thanks