Member since: 02-12-2018
Posts: 5
Kudos Received: 4
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2814 | 08-18-2019 11:19 PM |
08-18-2019 11:19 PM
@saulo_sobreiro Are you using HDP 2.3? If so, try HDP 3.1, which ships with the KafkaStorageHandler. If you are using open-source Hive 2.3, move to a version above 3.1. A minimal table definition would look like the sketch below.
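A minimal sketch of a Kafka-backed Hive table, assuming a plain-text listener; the topic, broker, and column names here are placeholders (by default the handler maps table columns to JSON fields in the Kafka messages):

CREATE EXTERNAL TABLE kafka_hive_table (`payload` string)
STORED BY 'org.apache.hadoop.hive.kafka.KafkaStorageHandler'
TBLPROPERTIES (
  "kafka.topic" = "<topic-name>",
  "kafka.bootstrap.servers" = "<kafka-broker>:<port>");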
08-14-2019 10:23 AM
Yes, it supports SSL; you can pass the required parameters in TBLPROPERTIES. Example:

CREATE EXTERNAL TABLE kafka_hive_table_edgenode_SASL_SSL (
  `Country Name` string,
  `Language` string,
  `_id` struct<`$oid`:string, `name`:string>,
  `account` struct<`$accountstat`:string>)
STORED BY 'org.apache.hadoop.hive.kafka.KafkaStorageHandler'
TBLPROPERTIES (
  "kafka.topic" = "kafka_hive_topic",
  "kafka.bootstrap.servers" = "<kafka-broker>:<port>",
  "kafka.consumer.security.protocol" = "SASL_SSL",
  "kafka.consumer.ssl.truststore.location" = "<truststore-path>",
  "kafka.consumer.ssl.truststore.password" = "<password>");
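Once defined, the table can be queried like any other Hive table; a hedged example, assuming the handler exposes the usual Kafka metadata columns (`__partition`, `__offset`, `__timestamp` in milliseconds) and that to_unix_timestamp returns seconds:

SELECT `Country Name`, `Language`, `__partition`, `__offset`
FROM kafka_hive_table_edgenode_SASL_SSL
WHERE `__timestamp` > 1000 * to_unix_timestamp(CURRENT_TIMESTAMP - INTERVAL '10' MINUTES);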
11-27-2018 03:59 PM
4 Kudos
Steps: When configuring a Kafka broker to use only SSL, you can get both authentication and encryption by enabling two-way SSL with the parameter ssl.client.auth=required. Go to Ambari > Kafka > Configs > Custom kafka-broker and add the ssl.client.auth=required parameter; a sketch of the relevant broker settings follows.
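A sketch of the corresponding custom kafka-broker properties, assuming an SSL listener on port 9093; the host and keystore paths are placeholders (these are standard Kafka SSL settings, so adjust to your environment):

listeners=SSL://<broker-host>:9093
security.inter.broker.protocol=SSL
ssl.keystore.location=<server.keystore.jks>
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=<server.truststore.jks>
ssl.truststore.password=<truststore-password>
ssl.client.auth=required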
With ssl.client.auth=required, the certificate owner name (subject) works as the username, and authentication happens with that username. You have to add the server certificate owner name from each broker as a superuser (usernames separated by semicolons) so that each broker can access resources on the other brokers; with two brokers, for example, you need to add two server certificate owner names. Go to Ambari > Kafka > Configs > Custom kafka-broker and configure the super.users parameter:
super.users=User:CN=user1,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx;User:CN=user2,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx

You can create different client certificates for different users; a keytool sketch follows. If the client certificate owner name is "CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx", you can then set topic ACLs to allow this user to produce and consume data.
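A hedged keytool sketch for creating a per-user client keystore and truststore; the alias, file names, and CA certificate file are placeholders, and in practice the client certificate must be signed by, or directly trusted in, the brokers' truststore (the CSR/signing round-trip is omitted here):

keytool -genkeypair -alias kafka-client -keyalg RSA -validity 365 \
  -keystore client.keystore.jks -storepass <keystore-password> \
  -dname "CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx"
keytool -importcert -alias CARoot -file <ca-cert> \
  -keystore client.truststore.jks -storepass <truststore-password> -noprompt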
For producing data, execute the commands below:

./kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=<zkhost>:<port> --add --allow-principal User:"CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx" --cluster --producer --topic test
./kafka-console-producer.sh --broker-list <broker>:<port> --topic test --producer.config <client-ssl-path>/client-ssl.properties --security-protocol SSL

client-ssl.properties:
security.protocol=SSL
ssl.truststore.location=<client.truststore.jks>
ssl.truststore.password=<truststore-password>
ssl.keystore.location=<client.keystore.jks>
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>

For consuming data, execute the commands below:

./kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=<zkhost>:<port> --add --allow-principal User:"CN=kafka,OU=kafka,O=kafka,L=kafka,ST=kafka,C=xx" --group='*' --consumer --topic test
./kafka-console-consumer.sh --bootstrap-server <broker>:<port> --topic test --security-protocol SSL --consumer.config <client-ssl-path>/client-ssl.properties --from-beginning

If the username contains any attribute apart from CN, L, ST, O, OU, C, and STREET, it is emitted as an Object Identifier (OID), which is difficult to work with. In that case you can implement a CustomPrincipalBuilder class, implementing the PrincipalBuilder interface, just to extract a simple username from the certificate owner name. Build the custom jar, copy it to the <kafka-broker>/libs/ folder on all brokers, and configure the parameter below in Ambari > Kafka > Configs > Custom kafka-broker to load the custom class:

principal.builder.class=kafka.security.auth.CustomPrincipalBuilder
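A minimal sketch of such a class, assuming Kafka 1.0+ where the KafkaPrincipalBuilder interface replaced the older PrincipalBuilder; the package and class name match the principal.builder.class value above, and everything else (CN extraction, error handling) is illustrative:

package kafka.security.auth;

import java.security.Principal;
import javax.naming.ldap.LdapName;
import javax.naming.ldap.Rdn;
import org.apache.kafka.common.security.auth.AuthenticationContext;
import org.apache.kafka.common.security.auth.KafkaPrincipal;
import org.apache.kafka.common.security.auth.KafkaPrincipalBuilder;
import org.apache.kafka.common.security.auth.SslAuthenticationContext;

public class CustomPrincipalBuilder implements KafkaPrincipalBuilder {
    @Override
    public KafkaPrincipal build(AuthenticationContext context) {
        if (context instanceof SslAuthenticationContext) {
            try {
                // Full subject DN of the client certificate, e.g. "CN=kafka,OU=kafka,..."
                Principal peer = ((SslAuthenticationContext) context).session().getPeerPrincipal();
                // Parse the DN and keep only the CN value as the Kafka username
                for (Rdn rdn : new LdapName(peer.getName()).getRdns()) {
                    if ("CN".equalsIgnoreCase(rdn.getType())) {
                        return new KafkaPrincipal(KafkaPrincipal.USER_TYPE, rdn.getValue().toString());
                    }
                }
            } catch (Exception e) {
                throw new RuntimeException("Failed to extract CN from client certificate", e);
            }
        }
        // Fall back for non-SSL connections
        return KafkaPrincipal.ANONYMOUS;
    }
}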