
How to use the Kerberos Wizard with a changed “krb5-conf directory path” with Ambari Blueprints?

New Contributor

Hello,

We are using Ambari Blueprints for the node installations, and it works fine with Kerberos if we use the regular '/etc/' directory for krb5.conf.

Unfortunately, the '/etc/' folder already holds the krb5.conf of other, non-Hadoop services (with different settings), and those services manage and update their krb5.conf themselves, so we can't just add our realm configuration to it.

We tried the Enable Kerberos Wizard and changed the “krb5-conf directory path” under “Advanced krb5-conf” in the second step of the wizard to e.g. /etc/hadoop. The setting is accepted, but the Kerberos client test in the next step of the wizard then fails with: kinit: Cannot find KDC for requested realm while getting initial credentials

So it seems that Ambari is not distributing the changed krb5 path to the clients.

Adding “export KRB5_CONFIG=/etc/hadoop/krb5.conf” to ambari-env.sh for the server and the agents in the respective /var/lib/ambari-[server/agent] directories, as well as pointing -Djava.security.krb5.conf to the right path in the server's ambari-env.sh, does not help either.

Neither does adding “export KRB5_CONFIG=/etc/hadoop/krb5.conf” to the hadoop-env template (in the HDFS config options), or adding -Djava.security.krb5.conf=/etc/hadoop/krb5.conf to HADOOP_OPTS in the same file.

We also tried to use the API as described in the Ambari wiki at https://cwiki.apache.org/confluence/display/AMBARI/Automated+Kerberizaton – with no success either. Here, the Hadoop services fail to start after enabling Kerberos security, again with “kinit: Cannot find KDC for requested realm while getting initial credentials”.
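For reference, the API-based attempt was roughly along these lines (a sketch only, following the wiki page above; the cluster name, realm, URLs and credentials are placeholders, and the krb5-conf property names such as conf_dir should be verified against your Ambari version):

  AMBARI=http://ambari-server.example.com:8080

  # Point the krb5-conf "conf_dir" (the property behind the wizard's
  # "krb5-conf directory path") at /etc/hadoop before enabling Kerberos.
  curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT "$AMBARI/api/v1/clusters/MYCLUSTER" \
    -d '{"Clusters":{"desired_config":{"type":"krb5-conf","tag":"krb5_custom_dir","properties":{"conf_dir":"/etc/hadoop","manage_krb5_conf":"true"}}}}'

  # Enable Kerberos, passing the KDC admin credentials as session attributes.
  curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT "$AMBARI/api/v1/clusters/MYCLUSTER" \
    -d '{"session_attributes":{"kerberos_admin":{"principal":"admin/admin@EXAMPLE.COM","password":"secret"}},"Clusters":{"security_type":"KERBEROS"}}'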

Last, I should also add that the Kerberos tools work fine when used on the command line with KRB5_CONFIG set to /etc/hadoop/krb5.conf, so this does not seem to be a Kerberos problem.
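For example, a manual check on a cluster node looks like this (realm and principal are placeholders):

  export KRB5_CONFIG=/etc/hadoop/krb5.conf
  kinit someuser@EXAMPLE.REALM    # succeeds and finds the KDC
  klist                           # shows the ticket for EXAMPLE.REALM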

Can you suggest anything to solve this problem?

Thanks in advance 🙂


5 REPLIES


It seems like you found a limitation for which I am not sure there is an easy solution. I believe this is fixable, but the places where the krb5.conf file is used span more than just the calls to kinit and klist. We need to make sure that the services that have built-in support for Kerberos can check alternate paths for the krb5.conf file.

New Contributor

The solution we have found for the problem so far is the following:

For a krb5.conf residing in /etc/hadoop/:

Set “-Djava.security.krb5.conf=/etc/hadoop/krb5.conf” for ambari-server, in /var/lib/ambari-server/ambari-env.sh.
Set “export KRB5_CONFIG=/etc/hadoop/krb5.conf” in /var/lib/ambari-server/ambari-env.sh.
Set “export KRB5_CONFIG=/etc/hadoop/krb5.conf” in /var/lib/ambari-agent/ambari-env.sh.
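In ambari-env.sh the added lines look roughly like this (a sketch; it assumes the stock AMBARI_JVM_ARGS variable in the server's ambari-env.sh, so check the exact variable name for your Ambari version):

  # /var/lib/ambari-server/ambari-env.sh
  export KRB5_CONFIG=/etc/hadoop/krb5.conf
  AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Djava.security.krb5.conf=/etc/hadoop/krb5.conf"

  # /var/lib/ambari-agent/ambari-env.sh
  export KRB5_CONFIG=/etc/hadoop/krb5.conf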

Set “-Djava.security.krb5.conf=/etc/hadoop/krb5.conf” for all services (e.g. in hadoop-env, yarn-env, zookeeper-env, ams-env).
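For HDFS, for example, this means adding the flag to HADOOP_OPTS in the hadoop-env template (a sketch; the corresponding *_OPTS variable and -env config differ per service):

  # hadoop-env template (HDFS > Configs > Advanced hadoop-env)
  export HADOOP_OPTS="-Djava.security.krb5.conf=/etc/hadoop/krb5.conf ${HADOOP_OPTS}"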

Then there is one important piece missing: KRB5_CONFIG is not set for the Kerberos tool calls (kinit, kadmin, …) that Ambari issues.

As a simple workaround we patched /usr/lib/python2.6/site-packages/resource_management/core/shell.py to pass the environment variable through if it is set in the calling environment: “env['KRB5_CONFIG'] = os.getenv('KRB5_CONFIG','/etc/krb5.conf')”

(e.g. sed -i -e "/# prepare command cmd/a\ env['KRB5_CONFIG'] = os.getenv('KRB5_CONFIG','/etc/krb5.conf')" /tmp/shell.py)

So far the results look promising. All services get initialized and are running; I ran tests using the HDFS tools, YARN and Hive, and all went well. (Credit and thanks for the published solution to H.)
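The checks were along the lines of the following (a sketch; keytab, principal and hostnames are placeholders):

  export KRB5_CONFIG=/etc/hadoop/krb5.conf
  kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa@EXAMPLE.REALM
  hdfs dfs -ls /
  yarn node -list
  beeline -u "jdbc:hive2://hiveserver.example.com:10000/default;principal=hive/_HOST@EXAMPLE.REALM" -e "show databases;"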

Can you see any problems with this approach?

Master Mentor

@Claudio W has this been resolved? Please accept the best answer or provide your own solution.

New Contributor

@Artem Ervits Well, the qualified engineer working on it fixed it with the solution I mentioned above. It essentially amounts to patching the Ambari server code - which hopefully could be integrated into the Ambari source - anyone at Hortonworks is welcome to do so 🙂

Master Mentor

@Claudio W great, usually enhancements to code trickle down to trunk.