Created 07-10-2020 02:22 PM
We are trying to integrate Kafka and Ranger in a kerberized environment and are receiving a 401 error while Kafka tries to download policies from Ranger. The error we received on the Kafka side is as below.
Kafka Logs while starting broker
Error getting policies. secureMode=true, user=kafka/<public DNS>@EXAMPLE.COM (auth:KERBEROS), response={"httpStatusCode":401,"statusCode":0}, serviceName=KafkaTest (org.apache.ranger.admin.client.RangerAdminRESTClient)
Configuration on Kafka server
COMPONENT_INSTALL_DIR_NAME=/home/ec2-user/kafka
POLICY_MGR_URL=https://<public DNS of Ranger>:6182
REPOSITORY_NAME=KafkaTest
Configuration at Ranger side as below:
Core-site.xml
<configuration>
<property>
<name>hadoop.security.authentication</name>
<value>kerberos</value>
<description>Set the authentication for the cluster.
Valid values are: simple or kerberos.</description>
</property>
</configuration>
Install.properties
spnego_principal=http/<public DNS>@EXAMPLE.COM
#spnego_principal=*
spnego_keytab=/home/ec2-user/spnego.service.keytab
token_valid=30
cookie_domain=<public DNS>
cookie_path=/
admin_principal=rangeradmin/<public DNS>@EXAMPLE.COM
admin_keytab=/home/ec2-user/rangeradmin.service.keytab
lookup_principal=rangerlookup/<public DNS>@EXAMPLE.COM
lookup_keytab=/home/ec2-user/rangerlookup.service.keytab
hadoop_conf=/etc/hadoop/conf
On Ranger-admin UI, we configured below properties
policy.download.auth.users=kafka
We also tried giving the user as below:
policy.download.auth.users=kafka/<public dns> (basically the principal of the Kafka broker)
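One way to narrow down whether the failure is on the Ranger side or the Kafka plugin side is to hit the secure policy-download endpoint directly with curl's SPNEGO support. This is a hedged manual check, not something from the product docs; the URL and service name are the ones from this thread, and it assumes a valid ticket obtained via kinit with the kafka keytab:

```shell
# Get a ticket as the broker principal first (paths/principals from this thread):
#   kinit -kt /home/ec2-user/kafka.service.keytab kafka/<public DNS>@EXAMPLE.COM
# --negotiate -u : tells curl to authenticate with the Kerberos ticket (SPNEGO).
curl -k --negotiate -u : \
  "https://<public DNS of Ranger>:6182/service/plugins/secure/policies/download/KafkaTest"
# A 200 with a JSON policy body means SPNEGO on the Ranger side works;
# a 401 here points at the spnego principal/keytab configured in Ranger.
```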
Ranger-admin logs
"GET /service/plugins/secure/policies/download/KafkaTest?lastKnownVersion=51&lastActivationTime=1594414061001&pluginId=kafka@<internal-IP of Kafka>-KafkaTest&clusterName= HTTP/1.1" 401 - "-" "Java/1.8.0_242"
Please let us know what we have done wrong here. Thanks for all your help!
Created 07-11-2020 03:52 AM
Please have a look at this response by Vipin Rathor
Ranger Policy download with HTTP response 401
Hope that helps
Created on 07-12-2020 10:14 AM - edited 07-12-2020 10:15 AM
Created 07-13-2020 02:17 PM
I had created the wrong principal for the SPNEGO keytab. The principal needs to be HTTP/test-dummy-X.openstacklocal@EXAMPLE.COM, not http/test-dummy-X.openstacklocal@EXAMPLE.COM (Kerberos principal names are case-sensitive). After making this change, the 401 error went away.
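For anyone hitting the same thing, the principals stored in a keytab can be inspected with klist, which makes the case mismatch easy to spot (the path is the one from my install.properties above):

```shell
# List all principals in the SPNEGO keytab; the service part must be
# uppercase HTTP/, not http/.
klist -kt /home/ec2-user/spnego.service.keytab
# Correct entries look like: HTTP/test-dummy-X.openstacklocal@EXAMPLE.COM
```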
Created 07-13-2020 02:30 PM
The above solution works when I do a kinit with the Kafka service keytab. My Kafka jaas file is as below.
kafka_jaas.conf
KafkaServer {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=true
keyTab="/home/ec2-user/kafka.service.keytab"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
Client {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=true
keyTab="/home/ec2-user/kafka.service.keytab"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
storeKey=true
keyTab="/home/ec2-user/kafka.service.keytab"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
When I destroyed the ticket, I got the below error.
WARN Error getting policies. secureMode=true, user=root (auth:KERBEROS), response={"httpStatusCode":401,"statusCode":0}, serviceName=KafkaTest (org.apache.ranger.admin.client.RangerAdminRESTClient)
After looking into the error, I found the user is coming as root (highlighted above). I don't want to run the kinit command explicitly, which is why I provided the KafkaClient section in my jaas file above. I did export KAFKA_OPTS to pass this jaas file.
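Concretely, this is what I exported before starting the broker (assuming the jaas file above is saved at /home/ec2-user/kafka_jaas.conf; adjust the path for your setup):

```shell
# Point the JVM at the JAAS config so the broker logs in with its keytab
# instead of relying on the root user's ticket cache from kinit.
export KAFKA_OPTS="-Djava.security.auth.login.config=/home/ec2-user/kafka_jaas.conf"
```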
Can you please help me see what I have done wrong here? Thanks for all your help!
Created on 07-14-2020 11:53 AM - edited 07-14-2020 12:08 PM
Ambari explicitly configures a series of Kafka settings and creates a JAAS configuration file for the Kafka server. It is not necessary to modify these settings, but check the below values in server.properties:
listeners
listeners=SASL_PLAINTEXT://kafka01.example.com:6667
listeners=PLAINTEXT://your_host:9092, TRACE://:9091, SASL_PLAINTEXT://0.0.0.0:9093
advertised.listeners
A list of listeners to publish to ZooKeeper for clients to use. If advertised.listeners is not set, the value for listeners will be used.
advertised.listeners=SASL_PLAINTEXT://kafka01.example.com:6667
security.inter.broker.protocol
In a Kerberized cluster, brokers are required to communicate over SASL:
security.inter.broker.protocol=SASL_PLAINTEXT
principal.to.local.class
Transforms the Kerberos principals to their local Unix usernames.
principal.to.local.class=kafka.security.auth.KerberosPrincipalToLocal
super.users
Specifies user accounts that will acquire all cluster permissions. These super users have all permissions that would otherwise need to be added through the kafka-acls.sh script.
super.users=User:developer1;User:analyst1
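For comparison, permissions for non-super users would be granted through kafka-acls.sh. A hedged example of the old-style, ZooKeeper-backed invocation matching this HDP-era setup (the topic, principal, and ZooKeeper host are made-up names for illustration):

```shell
# Grant developer2 read access on topic "test" via the ZooKeeper-backed authorizer.
kafka-acls.sh --authorizer-properties zookeeper.connect=<zk-host>:2181 \
  --add --allow-principal User:developer2 \
  --operation Read --topic test
```

Note that when Ranger is the authorizer, as in this thread, topic permissions are managed through Ranger policies instead.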
JAAS Configuration File for the Kafka Server
Enabling Kerberos sets up a JAAS login configuration file for the Kafka server to authenticate the Kafka broker against Kerberos.
Usually in /usr/hdp/current/kafka-broker/config/kafka_server_jaas.conf
KafkaServer {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/ec2-user/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="kafka"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
Client { // used for zookeeper connection
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/ec2-user/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="zookeeper"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
Setting for the Kafka Producer
Ambari usually sets the below key-value pair in the producer.properties file; if it is not present, please add it:
security.protocol=SASL_PLAINTEXT
JAAS Configuration File for the Kafka Client
This file will be used for any client (consumer, producer) that connects to a Kerberos-enabled Kafka cluster.
The file is stored at:
/usr/hdp/current/kafka-broker/config/kafka_client_jaas.conf
Kafka client configuration with keytab, for producers:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=true
keyTab="/home/ec2-user/kafka.service.keytab"
storeKey=true
useTicketCache=false
serviceName="kafka"
principal="kafka/<public DNS>@EXAMPLE.COM";
};
Kafka client configuration without keytab, for producers:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
renewTicket=true
serviceName="kafka";
};
Kafka client configuration for consumers:
KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useTicketCache=true
renewTicket=true
serviceName="kafka";
};
Check and set the Ranger policy permissions for Kafka, and ensure that the Kafka keytab is readable by the kafka service user.
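A quick sketch for sanity-checking the keytab permissions (the path is the one used earlier in this thread; the check assumes GNU stat on Linux):

```shell
# Verify a keytab is locked down to its owner; keytabs should be mode 400 or 600.
check_keytab_perms() {
  local f="$1"
  local mode
  mode=$(stat -c '%a' "$f")   # octal permissions, e.g. 600
  case "$mode" in
    400|600) echo "OK: $f mode $mode" ;;
    *)       echo "WARN: $f mode $mode is too open"; return 1 ;;
  esac
}

# Path from this thread; adjust for your cluster. Also make sure the broker's
# OS user owns the file, e.g.: chown kafka:kafka "$KEYTAB"
KEYTAB=/home/ec2-user/kafka.service.keytab
if [ -f "$KEYTAB" ]; then
  check_keytab_perms "$KEYTAB"
fi
```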
Hope that helps