Created 10-04-2021 02:11 AM
Hello,
we observe a lot of annoying messages in the Kafka logs, like this one:
2021-09-01 17:09:47,435 WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: unable to return groups for user OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user
PartialGroupNameException The user name 'OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user' is not found. id: OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user: no such user
id: OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user: no such user
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.resolvePartialGroupNames(ShellBasedUnixGroupsMapping.java:291)
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:215)
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroupsSet(ShellBasedUnixGroupsMapping.java:123)
at org.apache.hadoop.security.Groups$GroupCacheLoader.fetchGroupSet(Groups.java:413)
at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:351)
at org.apache.hadoop.security.Groups$GroupCacheLoader.load(Groups.java:300)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
at com.google.common.cache.LocalCache.get(LocalCache.java:3953)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3976)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4960)
at org.apache.hadoop.security.Groups.getGroupInternal(Groups.java:258)
at org.apache.hadoop.security.Groups.getGroupsSet(Groups.java:230)
at org.apache.hadoop.security.UserGroupInformation.getGroupsSet(UserGroupInformation.java:1760)
at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1726)
at org.apache.ranger.audit.provider.MiscUtil.getGroupsForRequestUser(MiscUtil.java:587)
at org.apache.ranger.authorization.kafka.authorizer.RangerKafkaAuthorizer.authorize(RangerKafkaAuthorizer.java:155)
at org.apache.ranger.authorization.kafka.authorizer.RangerKafkaAuthorizer.authorize(RangerKafkaAuthorizer.java:135)
at kafka.security.authorizer.AuthorizerWrapper$$anonfun$authorize$1.apply(AuthorizerWrapper.scala:52)
at kafka.security.authorizer.AuthorizerWrapper$$anonfun$authorize$1.apply(AuthorizerWrapper.scala:50)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at kafka.security.authorizer.AuthorizerWrapper.authorize(AuthorizerWrapper.scala:50)
at kafka.server.KafkaApis.filterAuthorized(KafkaApis.scala:2775)
at kafka.server.KafkaApis.handleFetchRequest(KafkaApis.scala:639)
at kafka.server.KafkaApis.handle(KafkaApis.scala:128)
at kafka.server.KafkaRequestHandler.run(KafkaRequestHandler.scala:75)
at java.lang.Thread.run(Thread.java:748)
OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user (values changed) is the subject DN of a self-signed certificate used as a client certificate for the TLS-based Kafka connection.
The certificate exists, the name matches, and a corresponding entry was created in Ranger. The connection actually works.
We still don't understand where these warnings come from.
Cloudera Enterprise 7.2.4
Best Regards
Jaro
Created 10-05-2021 09:29 AM
I believe the user "some_user" exists, but not a user named
'OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user'.
You should configure ssl.principal.mapping.rules to map the DN to the short name; that should avoid these warnings.
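For example, a minimal sketch for the broker's server.properties, assuming your DNs always contain a CN attribute as in the log above (the exact regex is an assumption; adjust it to your DN layout and attribute order):

```properties
# Map a TLS client DN such as
#   OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user
# to the short name "some_user" before authorization.
# Rule syntax per KIP-371; DEFAULT keeps the full DN as a
# fallback when no rule matches.
ssl.principal.mapping.rules=RULE:^.*CN=([a-zA-Z0-9._-]*).*$/$1/,DEFAULT
```

Note that the principal Ranger then sees is the short name, so the Ranger policies would need to reference "some_user" instead of the full DN.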
Created 10-05-2021 10:07 AM
Hi,
This sounds promising.
Actually, I came across principal mapping rules recently, but I was quite unclear about the implications.
The fact is, "some_user" is not a POSIX or LDAP user at all; it exists only as a certificate.
Also, the user identifier in Ranger is the full DN, like "OU=Dept,O=Company,...". This is how my colleagues have set up the policies.
Does your assumption mean that every single client certificate should be backed by a POSIX user?
And what if the user is an external party accessing the broker remotely?
Best regards
Jaro
Created on 10-18-2021 07:13 AM - edited 10-18-2021 07:15 AM
Yes, there has to be a corresponding user in Ranger to authorize; it cannot be just a certificate.
You can use Kafka SSL authentication by setting it up for two-way (mutual) TLS,
and if you want authorization (via Ranger), add the user there.
The same is also discussed here - https://cwiki.apache.org/confluence/display/KAFKA/KIP-371%3A+Add+a+configuration+to+build+custom+SSL...
Hope this helps.
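To illustrate what a KIP-371-style mapping rule does to the DN from the log above, here is a small sketch (Python, not Kafka itself; the regex and the fallback behaviour are assumptions modelled on the DEFAULT rule of ssl.principal.mapping.rules):

```python
import re

# Hypothetical rule pattern: capture the value of the CN attribute
# anywhere in the DN and use it as the principal's short name.
CN_RULE = re.compile(r"^.*CN=([a-zA-Z0-9._-]*).*$")

def map_principal(dn: str) -> str:
    """Return the CN short name, or the full DN if no CN is found
    (mirroring the DEFAULT fallback of ssl.principal.mapping.rules)."""
    m = CN_RULE.match(dn)
    return m.group(1) if m else dn

print(map_principal("OU=Dept,O=Company,C=DE,ST=Germany,CN=some_user"))
# -> some_user
```

With the DN mapped to "some_user", the group lookup that ShellBasedUnixGroupsMapping performs would at least receive a plausible user name instead of the whole DN.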
Created 10-21-2021 10:44 AM
@Jarinek, Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
Regards,
Vidya Sargur,