Member since: 12-11-2015
Posts: 3
Kudos Received: 4
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 6189 | 12-23-2015 09:35 AM |
02-02-2016 07:38 PM
@Artem Ervits Well, the qualified engineer working on it fixed it with the solution I posted below. It amounts to patching the Ambari server code, which could hopefully be integrated into the Ambari source; anyone at Hortonworks is welcome to do so 🙂
12-23-2015 09:35 AM
3 Kudos
The solution we have found so far for the problem is the following, for a krb5.conf residing in /etc/hadoop/:

- Set "-Djava.security.krb5.conf=/etc/hadoop/krb5.conf" for ambari-server in /var/lib/ambari-server/ambari-env.sh
- Add "export KRB5_CONFIG=/etc/hadoop/krb5.conf" to /var/lib/ambari-server/ambari-env.sh
- Add "export KRB5_CONFIG=/etc/hadoop/krb5.conf" to /var/lib/ambari-agent/ambari-env.sh
- Set "-Djava.security.krb5.conf=/etc/hadoop/krb5.conf" for all services (e.g. in hadoop-env, yarn-env, zookeeper-env, ams-env)

One important piece is still missing after that: KRB5_CONFIG is not set for the calls to the Kerberos tools (kinit, kadmin, ...). As a simple workaround we patched /usr/lib/python2.6/site-packages/resource_management/core/shell.py to set the environment variable if it is set in the calling environment:

"env['KRB5_CONFIG'] = os.getenv('KRB5_CONFIG','/etc/krb5.conf')"

(e.g. sed -i -e "/# prepare command cmd/a\ env['KRB5_CONFIG'] = os.getenv('KRB5_CONFIG','/etc/krb5.conf')" /tmp/shell.py)

So far the results look promising: all services get initialized and are running, and my tests with the HDFS tools, YARN, and Hive all went well. (Credit and thanks for the published solution to H.) Can you see any problems with this approach?
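The sed one-liner can be rehearsed safely on a stand-in file before touching the real installation. This is a sketch under the assumption of GNU sed (for `-i` and the one-line `a\` form); the stand-in file below replaces Ambari's actual shell.py:

```shell
# Create a stand-in for resource_management/core/shell.py so the edit
# can be rehearsed without touching the real Ambari installation.
cat > /tmp/shell.py <<'EOF'
# prepare command cmd
cmd = command
EOF

# Append the KRB5_CONFIG propagation line right after the marker comment,
# the same idea as the patch described above (GNU sed syntax).
sed -i -e "/# prepare command cmd/a\\env['KRB5_CONFIG'] = os.getenv('KRB5_CONFIG','/etc/krb5.conf')" /tmp/shell.py

# Show the patched file.
cat /tmp/shell.py
```

Once the output looks right, the same sed command can be pointed at the real file (after backing it up), keeping in mind that the inserted line must match the indentation of the surrounding Python code.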
12-11-2015 08:19 PM
1 Kudo
Hello, we are using Ambari Blueprints for the node installations. This works fine with Kerberos if we use the regular '/etc/' location for krb5.conf. Unfortunately, '/etc/krb5.conf' is already used by services other than Hadoop (with different settings), and those services manage and update their krb5.conf themselves, so we can't simply add our realm configuration to it.

We tried the Enable Kerberos Wizard, changing the "krb5-conf directory path" under "Advanced krb5-conf" in the second step of the wizard to e.g. /etc/hadoop, which does work. However, the Kerberos client tests in the next step of the wizard then fail with:

kinit: Cannot find KDC for requested realm while getting initial credentials

So it seems that Ambari is not distributing the changed krb5 path to the clients. Adding "export KRB5_CONFIG=/etc/hadoop/krb5.conf" to ambari-env.sh for the server and the agents in the respective /var/lib/ambari-[server/agent] directories, as well as pointing -Djava.security.krb5.conf to the right path in the server's ambari-env.sh, does not help either. Neither does adding "export KRB5_CONFIG=/etc/hadoop/krb5.conf" to the hadoop-env template (in the HDFS config options) or -Djava.security.krb5.conf=/etc/hadoop/krb5.conf to HADOOP_OPTS in the same file.

We also tried the API as described in the Ambari wiki at https://cwiki.apache.org/confluence/display/AMBARI/Automated+Kerberizaton, with no success either: the Hadoop services fail to start after enabling Kerberos security, again with "kinit: Cannot find KDC for requested realm while getting initial credentials".

Finally, I should add that the Kerberos tools work fine when used on the command line with KRB5_CONFIG set to /etc/hadoop/krb5.conf, so this does not seem to be a Kerberos problem itself. Can you suggest anything to solve this problem? Thanks in advance 🙂
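For reference, the ambari-env.sh additions we tried look like the lines below. This is a sketch: AMBARI_JVM_ARGS is assumed to be the variable ambari-env.sh already exports for the server's JVM options (verify against your Ambari version), and the KRB5_CONFIG export goes into both the server's and the agents' ambari-env.sh:

```shell
# Added to /var/lib/ambari-server/ambari-env.sh (the KRB5_CONFIG export
# also to /var/lib/ambari-agent/ambari-env.sh). AMBARI_JVM_ARGS is an
# assumption about the JVM-options variable; check your ambari-env.sh.
export KRB5_CONFIG=/etc/hadoop/krb5.conf
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Djava.security.krb5.conf=/etc/hadoop/krb5.conf"
```

As described above, these settings alone were not enough for us: the Kerberos client tests in the wizard still failed, which is why we suspect Ambari itself does not pass the custom path through to the tool invocations.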
Labels:
-
Apache Ambari