Created 03-16-2018 12:02 AM
Hi All,
Product: Hortonworks Data Platform 2.6.3 Sandbox
I am having trouble logging Storm audit events to HDFS when I enable Ranger auditing for Storm. I enabled Kerberos with Ambari, since the documentation notes that Kerberos is required to install the Ranger Storm plugin. I kerberized the cluster, enabled the Storm plugin, and created a Storm policy like the one in the attachment. Then I started and killed the Storm topology with these commands:
- storm jar storm-starter-0.0.1-storm-0.9.0.1.jar storm.starter.WordCountTopology WordCount
- storm kill WordCount
However, I don't see any log files in HDFS under /ranger/audit/storm. (I can't view them directly from the Ranger UI because of a Solr error, but that's a separate issue.) In nimbus.log, I see this error:
2018-03-08 18:01:01.037 o.a.r.a.p.BaseAuditHandler [ERROR] Error writing to log file.
org.apache.hadoop.ipc.RemoteException: User: nimbus/sandbox-hdp.hortonworks.com@HORTONWORKS.COM is not allowed to impersonate storm
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1498) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.ipc.Client.call(Client.java:1398) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at com.sun.proxy.$Proxy54.getFileInfo(Unknown Source) ~[?:?]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:823) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_151]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_151]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_151]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_151]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185) ~[hadoop-common-2.7.3.2.6.3.0-235.jar:?]
I thought it was a mapping issue between the Kerberos principal and the Linux user, so I added
"RULE:[2:$1@$0](nimbus@HORTONWORKS.COM)s/.*/storm/" to hadoop.security.auth_to_local and
<property>
<name>hadoop.proxyuser.storm.group</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.storm.hosts</name>
<value>sandbox-hdp.hortonworks.com</value>
</property>
to core-site.xml. But nothing is logged.
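To sanity-check what the rule above actually produces, here is a rough, simplified sketch of how an auth_to_local RULE maps a principal. This is not Hadoop's real implementation (it skips rule flags, DEFAULT, and multi-rule evaluation); on an actual cluster the authoritative check is `hadoop org.apache.hadoop.security.HadoopKerberosName <principal>`.

```python
import re

def apply_auth_to_local_rule(principal, rule):
    """Evaluate one simplified RULE:[n:format](regex)s/pat/repl/ against a principal.

    $0 is the realm, $1/$2/... are the slash-separated principal components.
    Returns the mapped short name, or None if the rule does not apply.
    """
    m = re.match(r'RULE:\[(\d+):([^\]]*)\]\(([^)]*)\)s/([^/]*)/([^/]*)/', rule)
    if not m:
        return None
    n, fmt, match_re, sed_pat, sed_repl = m.groups()
    name, _, realm = principal.partition('@')
    components = name.split('/')
    if len(components) != int(n):          # rule applies only to n-component principals
        return None
    short = fmt.replace('$0', realm)       # substitute realm first, then components
    for i, c in enumerate(components, start=1):
        short = short.replace(f'${i}', c)
    if not re.fullmatch(match_re, short):  # the (regex) filter must match the short form
        return None
    return re.sub(sed_pat, sed_repl, short, count=1)  # apply the sed-style rewrite once

# The rule from the post maps the 2-component nimbus principal to "storm":
print(apply_auth_to_local_rule(
    "nimbus/sandbox-hdp.hortonworks.com@HORTONWORKS.COM",
    "RULE:[2:$1@$0](nimbus@HORTONWORKS.COM)s/.*/storm/"))
# -> storm
```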
Any help or advice is appreciated. Thanks in advance.
Created 03-16-2018 12:51 AM
Please see this doc - https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.4/bk_security/content/manually_updating_ambar... - which mentions the following:
For Storm, link to /etc/hadoop/conf/core-site.xml
under /usr/hdp/<version>/storm/extlib-daemon/ranger-storm-plugin-impl/conf
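The symlink step above can be sketched as follows. Temp directories stand in for the real paths here so the snippet is runnable anywhere; on an actual node you would use the plugin conf dir (/usr/hdp/<version>/storm/extlib-daemon/ranger-storm-plugin-impl/conf, with <version> matching your install) and the live /etc/hadoop/conf/core-site.xml.

```shell
PLUGIN_CONF="$(mktemp -d)"     # stands in for .../ranger-storm-plugin-impl/conf
HADOOP_CONF="$(mktemp -d)"     # stands in for /etc/hadoop/conf
echo '<configuration/>' > "$HADOOP_CONF/core-site.xml"   # the cluster's live config
echo 'stale copy' > "$PLUGIN_CONF/core-site.xml"         # the pre-existing plugin copy

# Move the stale copy aside, then link the live config in its place
mv "$PLUGIN_CONF/core-site.xml" "$PLUGIN_CONF/core-site.xml.bak"
ln -s "$HADOOP_CONF/core-site.xml" "$PLUGIN_CONF/core-site.xml"
readlink "$PLUGIN_CONF/core-site.xml"   # prints the symlink target
```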
Created 03-16-2018 06:40 PM
@vperiasamy Thank you very much for your response.
I already see a core-site.xml file in that directory.
I moved it out and linked /etc/hadoop/conf/core-site.xml as described in the doc.
I restarted HDFS, Ranger, and Storm, but I still see the same error.
Is there anything else I could have missed?
Created 03-16-2018 08:40 PM
Looks like nimbus/sandbox-hdp.hortonworks.com@HORTONWORKS.COM is not getting translated into storm, and you need to investigate why. Can you check your JAAS config and auth_to_local rules again? Also, the core-site.xml property should be hadoop.proxyuser.<component>.groups. Check for the typo; it looks like the trailing "s" is missing from your config.
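For reference, the corrected proxyuser block (with the plural "groups") would look like this in core-site.xml; the hosts value is carried over from the original post:

```xml
<!-- Correct property name is hadoop.proxyuser.<user>.groups (plural) -->
<property>
  <name>hadoop.proxyuser.storm.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.storm.hosts</name>
  <value>sandbox-hdp.hortonworks.com</value>
</property>
```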
Created 03-16-2018 10:06 PM
Turns out the typo was the problem.
That's just so silly....
@vperiasamy thank you so much.