Can someone explain the steps to configure an external kerberized HDFS cluster as storage for use with Cloudera Manager? I have followed these steps:

1. Copied krb5.conf to /etc/
2. Copied hdfs-site.xml and core-site.xml to /etc/hadoop
3. Placed the proper keytab in a location reachable by the Flume agent configuration

Nevertheless, I am getting this error:

```
Creating hdfs://NAMENODE/PATH/FILE.TMP process failed
java.lang.IllegalArgumentException: java.net.UnknownHostException: NAMENODE
```

I believe (maybe I am wrong) this is a sign that Cloudera Manager is not using the correct Hadoop config files, even though the correct ones should be picked up since I copied them into /etc/hadoop. What is interesting: using any Hadoop binary I can successfully query HDFS from this machine, so it should be possible for the Flume agent on my Cloudera Manager host to put data onto HDFS, and the Flume agent itself is configured correctly as well.

Is there a guideline somewhere on how to configure access to an external kerberized HDFS cluster (one that is not part of my Cloudera Manager cluster) so that one of my Flume agents can push data into it? Thanks for answers!
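For context, the kind of sink configuration I mean is sketched below. This is a minimal example of a Flume HDFS sink with Kerberos settings; the agent name, sink name, principal, and paths are placeholders, not my actual values:

```properties
# Hypothetical agent/sink names -- substitute your own.
agent1.sinks.hdfsSink.type = hdfs
agent1.sinks.hdfsSink.hdfs.path = hdfs://NAMENODE/PATH

# Kerberos identity the sink authenticates with (placeholder values):
agent1.sinks.hdfsSink.hdfs.kerberosPrincipal = flume/_HOST@EXAMPLE.COM
agent1.sinks.hdfsSink.hdfs.kerberosKeytab = /etc/security/keytabs/flume.keytab
```

For the `hdfs.path` above to resolve, the Flume agent's JVM must see the hdfs-site.xml/core-site.xml that define the target cluster, which is exactly the part that seems not to be happening in my case.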