06-29-2016
11:03 PM
2 Kudos
@ARUNKUMAR RAMASAMY I assume that by "secondary KDC" you mean either a master/slave relationship or a trust relationship (where each KDC hosts its own realm).
If this is a master/slave relationship, you need to edit the krb5.conf template (under the Advanced krb5-conf tab on the Kerberos service config page) to specify the additional KDC host and (optionally) the master KDC:
[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = {{realm}}
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  #default_tgs_enctypes = {{encryption_types}}
  #default_tkt_enctypes = {{encryption_types}}
{% if domains %}
[domain_realm]
{% for domain in domains.split(',') %}
  {{domain}} = {{realm}}
{% endfor %}
{% endif %}
[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log
[realms]
  {{realm}} = {
    admin_server = {{admin_server_host|default(kdc_host, True)}}
    kdc = fqdn.slave.kdc
    kdc = {{kdc_host}}
    master_kdc = {{kdc_host}}
  }
{# Append additional realm declarations below #}
Note the addition of the kdc and master_kdc entries in the realm definition. For the additional kdc entry, "fqdn.slave.kdc" should be changed to the FQDN (and optionally port) of the slave KDC.
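For illustration, if the realm were EXAMPLE.COM with the master KDC on kdc1.example.com and the slave on kdc2.example.com (hypothetical hostnames), the rendered [realms] block would end up looking like:
[realms]
  EXAMPLE.COM = {
    admin_server = kdc1.example.com
    kdc = kdc2.example.com
    kdc = kdc1.example.com
    master_kdc = kdc1.example.com
  }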
If this is a trust relationship, then you need to add the additional realm to the krb5.conf template (under the Advanced krb5-conf tab on the Kerberos service config page):
[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = {{realm}}
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  #default_tgs_enctypes = {{encryption_types}}
  #default_tkt_enctypes = {{encryption_types}}
{% if domains %}
[domain_realm]
{% for domain in domains.split(',') %}
  {{domain}} = {{realm}}
{% endfor %}
{% endif %}
[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log
[realms]
  {{realm}} = {
    admin_server = {{admin_server_host|default(kdc_host, True)}}
    kdc = {{kdc_host}}
  }
{# Append additional realm declarations below #}
  ADDITIONAL.REALM = {
    admin_server = FQDN.admin.server
    kdc = fqdn.kdc
  }
Note the additional realm, named "ADDITIONAL.REALM", which should be changed to the actual realm name. The admin_server and kdc values also need to be set appropriately. You will also want to add the additional realm to the "Additional Realms" value on the Kerberos admin page so that an entry is created in the auto-generated auth-to-local rule sets. By editing the data on this page and saving it, I believe the configurations will be updated, though you may have to restart some services. If not, click the "Regenerate Keytabs" button and the configurations will be updated along with the new keytab files.
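For reference, the kind of entry the auto-generated auth-to-local rule set would contain for the extra realm (assuming the realm name ADDITIONAL.REALM and that principals should simply have the realm stripped) looks something like:
RULE:[1:$1@$0](.*@ADDITIONAL\.REALM)s/@.*//
DEFAULT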
06-29-2016
06:06 PM
1 Kudo
@Chris Nauroth
The hadoop kerbname tool is awesome! I just had an opportunity to use it to test out a complex rule, but I needed to use the _old_ syntax to get it to work (thanks for referencing your comment that explains the usage):
hadoop org.apache.hadoop.security.HadoopKerberosName 123456789@EXAMPLE.COM
I meant to add this to the doc, I just haven't had the time. I will get around to it though. Thanks for providing the tool.
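For anyone trying this, a hypothetical session (the exact output format may vary by Hadoop version) would look something like:
$ hadoop org.apache.hadoop.security.HadoopKerberosName 123456789@EXAMPLE.COM
Name: 123456789@EXAMPLE.COM to 123456789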
06-24-2016
06:23 PM
Thanks for the update. I am surprised that WebHCat was the only issue. I see in the scripts why that is happening, but I would imagine other services might follow the same pattern. However, maybe this is a bug in WebHCat and someone might need to investigate why the script does this.
06-24-2016
02:29 PM
@Benjamin R That seems reasonable to me. I would be interested to hear if there are any ill effects from doing that. It would also help with the registry proposal I mentioned.
06-24-2016
01:35 PM
2 Kudos
@Benjamin R
I assume the reason the HDFS headless keytab file's permissions are 440 rather than 400 is to allow components (which execute under that same group) to use the keytab file in order to create their specific filesystem structures in HDFS during install. However, I am not totally sure. When adding the feature of enabling Kerberos to Ambari, we intended to ensure similarity with older versions of Ambari where the process was manual - see Ambari 1.7.0 Security Guide - Creating Service Principals and Keytab Files for Hadoop 2.x. As part of that process, the HDFS headless keytab file was chmod-ed to 440. At some point we want to investigate limiting the exposure of this keytab file (further than we have) by providing a registry so that services can declare what filesystem structure they need in HDFS. Then the HDFS service can create it on behalf of the service. This would allow the HDFS headless keytab file to be distributed only to the hosts where a NameNode is installed, and in turn the permissions on the file could be set to 400.
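As a concrete illustration (the path below is the typical Ambari default keytab location; yours may differ):
# Inspect the headless keytab's ownership and mode
ls -l /etc/security/keytabs/hdfs.headless.keytab
# Typically shows: -r--r----- hdfs hadoop ... (mode 440; group-readable so
# other components in the hadoop group can authenticate with it during install)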
06-17-2016
01:15 PM
@Philippe Back... From all krb5.conf files on all nodes in the Hadoop cluster.
06-14-2016
03:58 PM
3 Kudos
@Pranay Vyas, When enabling Kerberos, Ambari can be set to integrate with an MIT KDC, Active Directory, and (soon) FreeIPA. This setting allows Ambari to interact with the specific KDC as needed. In the case of Active Directory, Ambari uses the Active Directory's LDAP interface, via the LDAPS protocol. During the enable Kerberos workflow, the user needs to supply details about this interface (LDAPS URL, container DN, and administrative credentials). Ambari can also be configured to set certain properties on the accounts it creates while enabling Kerberos. Note that the protocol MUST be LDAPS, since Active Directory requires a secure connection in order for a password to be set or updated on an account in the domain. As part of this process, Ambari internally creates and distributes the keytab files that are needed. This can be done because Ambari generates and temporarily holds on to the passwords for each account it creates in the Active Directory. Once the process is complete, the passwords are lost and cannot be retrieved. However, the keytab files will exist and be distributed, so the passwords are not needed.
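To make that concrete, the relevant kerberos-env settings for an Active Directory integration would look roughly like this (the hostname and DN below are placeholders for illustration):
kdc_type = active-directory
kdc_host = ad.example.com
admin_server_host = ad.example.com
ldap_url = ldaps://ad.example.com:636
container_dn = OU=hadoop,DC=example,DC=com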
06-13-2016
02:56 PM
4 Kudos
@Rob Ketcherside, You should be able to do what you are currently doing, but you will also need to add Kerberos descriptor entries for the custom service. See https://cwiki.apache.org/confluence/display/AMBARI/Automated+Kerberizaton#AutomatedKerberizaton-DescriptorSpecifications for information on creating this entry. As a quick start example, you will want to add a block under the Kerberos descriptor's services section (in the blueprint) for your service; note that services is an array:
{
  ...,
  "services" : [
    ...,
    {
      "name" : "OneFS",
      "identities": [
        ...
      ],
      "components": [
        ...
      ],
      ...
    },
    ...
  ],
  ...
}
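As a slightly fuller (purely hypothetical) sketch of what such an entry could contain - the config property names, keytab path, and component name below are invented for illustration and should be replaced with your service's actual values:
{
  "name" : "OneFS",
  "identities" : [
    {
      "name" : "onefs_service",
      "principal" : {
        "value" : "onefs/_HOST@${realm}",
        "type" : "service",
        "configuration" : "onefs-site/onefs.kerberos.principal"
      },
      "keytab" : {
        "file" : "${keytab_dir}/onefs.service.keytab",
        "owner" : { "name" : "onefs", "access" : "r" },
        "group" : { "name" : "hadoop", "access" : "" },
        "configuration" : "onefs-site/onefs.kerberos.keytab"
      }
    }
  ],
  "components" : [
    { "name" : "ONEFS_CLIENT" }
  ]
}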
06-03-2016
10:28 AM
@Blanca Sanz... Ambari does not care where the KDC lives. It is an external service, like the LDAP server, so feel free to install it on the Ambari server host, any host in the cluster, or any host not in the cluster but accessible via the network to the Ambari server host and the hosts in the cluster.
The "Automated Kerberos Installation and Configuration" article you are referring to walks through a scenario where a script performs all of the tasks for you. I don't think it is really a guideline - it is just a quick start process if that particular scenario suits your needs. However, if you choose to do so, using that script is not a bad idea.
If you choose not to use that script: once the KDC is installed and configured, you just need to run through the "Enable Kerberos Wizard" in Ambari. It asks you to fill in a few details about your KDC and then it does the rest of the work - this is not very different from how the script works, except you will need to click a "Next" button every so often. The hard part is installing and configuring your KDC, which isn't all that hard but may become a bit more complicated if you want to integrate it with your OpenLDAP server.
I am sorry that I do not have a definite answer for you, but essentially the KDC configuration is specific to your infrastructure and needs, and is not really a one-size-fits-all thing. Though in the simple case it can be - and that is what the "Automated Kerberos Installation and Configuration" article describes.
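For reference, a minimal MIT KDC install on a RHEL/CentOS host (package names and service commands assume that platform) usually comes down to something like:
# Install the KDC and admin server packages
yum install krb5-server krb5-libs krb5-workstation
# Create the Kerberos database for your realm (prompts for a master password)
kdb5_util create -s
# Create an admin principal that Ambari can use to manage identities
kadmin.local -q "addprinc admin/admin"
# Start the KDC and admin services
service krb5kdc start
service kadmin start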
06-01-2016
02:45 PM
Ambari integrates with a KDC and an LDAP server separately. The KDC integration point is used to manage Kerberos identities when enabling and disabling Kerberos. The LDAP integration point is used for authentication to Ambari itself and its views. In either case, the KDC and LDAP server can be the same server (Active Directory, for example) or different servers (an MIT KDC and OpenLDAP, for example). In your case, with the separate servers, there is no need to integrate the two for Ambari. However, if you have other reasons to integrate them, I believe that there are ways to do this. I am not too familiar with setting this up. You might want to take a look at the MIT KDC documentation, like http://web.mit.edu/Kerberos/krb5-1.13/doc/admin/conf_ldap.html.