Support Questions


[KERBEROS] Failed to kinit as the KDC administrator user, admin/admin@HADOOP.COM

Rising Star

Hello,

When enabling Kerberos via Ambari, the following popup appears during the "Test Kerberos Client" step, after the client installation:

[screenshot attached: rizalt_0-1717483147659.png]

The relevant entries from my ambari-server log are below:

2024-06-04 06:27:43,380  WARN [agent-report-processor-2] ActionManager:162 - The task 76 is not in progress, ignoring update
2024-06-04 06:27:43,861  INFO [ambari-client-thread-6248] AmbariManagementControllerImpl:4086 - Received action execution request, clusterName=hadoop, request=isCommand :true, action :null, command :KERBEROS_SERVICE_CHECK, inputs :{HAS_RESOURCE_FILTERS=true}, resourceFilters: [RequestResourceFilter{serviceName='KERBEROS', componentName='null', hostNames=[]}], exclusive: false, clusterName :hadoop
2024-06-04 06:27:44,149  WARN [ambari-client-thread-6248] KDCKerberosOperationHandler:329 - Failed to kinit as the KDC administrator user, admin/admin@HADOOP.COM:
	ExitCode: 1
	STDOUT: 
	STDERR: kinit: Server not found in Kerberos database while getting initial credentials

2024-06-04 06:27:44,151 ERROR [ambari-client-thread-6248] KerberosHelperImpl:2507 - Cannot validate credentials: org.apache.ambari.server.serveraction.kerberos.KerberosAdminAuthenticationException: Invalid KDC administrator credentials.
The KDC administrator credentials must be set as a persisted or temporary credential resource.This may be done by issuing a POST (or PUT for updating) to the /api/v1/clusters/:clusterName/credentials/kdc.admin.credential API entry point with the following payload:
{
  "Credential" : {
    "principal" : "(PRINCIPAL)", "key" : "(PASSWORD)", "type" : "(persisted|temporary)"}
  }
}
2024-06-04 06:27:44,152 ERROR [ambari-client-thread-6248] CreateHandler:80 - Bad request received: Invalid KDC administrator credentials.
The KDC administrator credentials must be set as a persisted or temporary credential resource.This may be done by issuing a POST (or PUT for updating) to the /api/v1/clusters/:clusterName/credentials/kdc.admin.credential API entry point with the following payload:
{
  "Credential" : {
    "principal" : "(PRINCIPAL)", "key" : "(PASSWORD)", "type" : "(persisted|temporary)"}
  }
}
2024-06-04 06:27:44,578  WARN [agent-report-processor-1] ActionManager:162 - The task 75 is not in progress, ignoring update
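
The error message itself describes a workaround: the KDC admin credential must be stored as a credential resource via the Ambari REST API. A minimal sketch is below; the endpoint and payload shape come from the log message above, while the Ambari URL and the admin:admin login are assumptions you would replace with your own values.

```shell
# Sketch: store the KDC admin credential via the Ambari REST API, as the
# error message suggests. AMBARI_URL and the admin:admin Ambari login are
# placeholders; the endpoint and payload come from the log above.
AMBARI_URL="http://ambari-server:8080"   # assumed Ambari server address
CLUSTER="hadoop"                         # cluster name from the log
PAYLOAD='{"Credential": {"principal": "admin/admin@HADOOP.COM", "key": "KDC_ADMIN_PASSWORD", "type": "temporary"}}'

# POST to create (use PUT instead to update an existing credential):
curl -u admin:admin -H "X-Requested-By: ambari" -X POST \
  -d "$PAYLOAD" \
  "$AMBARI_URL/api/v1/clusters/$CLUSTER/credentials/kdc.admin.credential" \
  || echo "request failed; check the Ambari URL and login"
```

Note that this only stores the credential; the underlying "Server not found in Kerberos database" error from kinit still has to be fixed on the KDC side.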

Can anyone help me, please?


Rising Star

@Shelton 

I'm following your steps, but I get the error below:

root@master1:~# sudo systemctl restart krb5-kdc
Job for krb5-kdc.service failed because the control process exited with error code.
See "systemctl status krb5-kdc.service" and "journalctl -xeu krb5-kdc.service" for details.
root@master1:~# systemctl status krb5-kdc.service
× krb5-kdc.service - Kerberos 5 Key Distribution Center
     Loaded: loaded (/lib/systemd/system/krb5-kdc.service; enabled; vendor preset: enabled)
     Active: failed (Result: exit-code) since Fri 2024-06-07 00:33:16 UTC; 5min ago
    Process: 13894 ExecStart=/usr/sbin/krb5kdc -P /var/run/krb5-kdc.pid $DAEMON_ARGS (code=exited, status=1/FAILURE)
        CPU: 92ms

Jun 07 00:33:16 master1.hadoop.com systemd[1]: Starting Kerberos 5 Key Distribution Center...
Jun 07 00:33:16 master1.hadoop.com krb5kdc[13894]: Couldn't open log file /var/log/krb5kdc.log: Read-only file system
Jun 07 00:33:16 master1.hadoop.com krb5kdc[13894]: krb5kdc: Configuration file does not specify default realm, attempt>
Jun 07 00:33:16 master1.hadoop.com krb5kdc[13894]: Configuration file does not specify default realm - while attemptin>
Jun 07 00:33:16 master1.hadoop.com systemd[1]: krb5-kdc.service: Control process exited, code=exited, status=1/FAILURE
Jun 07 00:33:16 master1.hadoop.com systemd[1]: krb5-kdc.service: Failed with result 'exit-code'.
Jun 07 00:33:16 master1.hadoop.com systemd[1]: Failed to start Kerberos 5 Key Distribution Center.
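
The journal output above shows two separate problems: the KDC cannot write its log file ("Read-only file system"), and the configuration specifies no default realm. A hedged sketch of the two checks, using the paths from the log (run on the KDC host):

```shell
# Check 1: is the filesystem holding /var/log mounted read-only?
LOG_OPTS=$(findmnt -n -o OPTIONS --target /var/log 2>/dev/null)
case "$LOG_OPTS" in
  ro|ro,*|*,ro,*|*,ro) LOG_STATE="read-only" ;;
  *)                   LOG_STATE="writable" ;;
esac
echo "/var/log filesystem is $LOG_STATE"

# Check 2: does [libdefaults] in /etc/krb5.conf name a default realm?
if grep -q '^[[:space:]]*default_realm' /etc/krb5.conf 2>/dev/null; then
  REALM_STATE="set"
else
  REALM_STATE="missing"
fi
echo "default_realm is $REALM_STATE in /etc/krb5.conf"
```

If the filesystem really is read-only, remount it read-write (e.g. `mount -o remount,rw /`) and find out why it flipped; if `default_realm` is missing, add `default_realm = HADOOP.COM` under `[libdefaults]` in /etc/krb5.conf before restarting krb5-kdc.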


Rising Star

@Shelton @Majeti 

I found that the "admin_keytab" path in kdc.conf, /etc/krb5kdc/kadm5.keytab, does not exist. Where can I create kadm5.keytab? Please see below:

[screenshot attached: rizalt_0-1717731855188.png]

Any suggestions?
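
On MIT Kerberos, the keytab that `admin_keytab` points to can be generated with `kadmin.local`'s `ktadd` once the realm database exists. A hedged sketch (the path matches the kdc.conf above; `kadmin/admin` and `kadmin/changepw` are the standard kadmind service principals):

```shell
# Sketch: create the admin keytab referenced by kdc.conf's admin_keytab.
# Run as root on the KDC host, after the realm database has been created.
KEYTAB=/etc/krb5kdc/kadm5.keytab
if command -v kadmin.local >/dev/null 2>&1; then
  kadmin.local -q "ktadd -k $KEYTAB kadmin/admin kadmin/changepw" \
    || echo "ktadd failed; make sure the realm database exists"
else
  echo "kadmin.local not found; install krb5-admin-server first"
fi
```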

Master Mentor

@rizalt 
Can you share the OS, OS version, and HDP version you are trying to Kerberize? I don't have a dump of the HDP binaries, but I would like to reproduce this and share the steps.

I suggest starting afresh: delete/destroy the current KDC as the root user (or with sudo). The following steps are specific to Ubuntu; re-adapt them for your OS.

# sudo kdb5_util -r HADOOP.COM destroy

Accept with a "Yes"

Now create a new Kerberos database from scratch.

Completely remove Kerberos:

$ sudo apt purge -y krb5-kdc krb5-admin-server krb5-config krb5-locales krb5-user
$ sudo rm -rf /var/lib/krb5kdc

Do a fresh installation.

First, get the FQDN of your KDC server; for this example:

# hostname -f
test.hadoop.com

Use the output above in the setup below.

# apt install krb5-kdc krb5-admin-server krb5-config

Proceed as follows:

At the prompts, enter:

Kerberos realm = HADOOP.COM
Kerberos server hostname = test.hadoop.com
Administrative server for the Kerberos realm = test.hadoop.com

Configure the krb5 admin server and create the realm:

# krb5_newrealm
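
`krb5_newrealm` is an interactive wrapper; if you prefer to create the realm database directly, the underlying step is roughly the following sketch (the `-s` flag stashes the master key so the KDC can start without prompting):

```shell
# Sketch: create the realm database directly instead of via krb5_newrealm.
# kdb5_util will prompt for a master password.
REALM="HADOOP.COM"
if command -v kdb5_util >/dev/null 2>&1; then
  kdb5_util create -r "$REALM" -s \
    || echo "kdb5_util create failed (perhaps the database already exists)"
else
  echo "kdb5_util not installed (it ships with krb5-kdc)"
fi
```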

Open /etc/krb5kdc/kadm5.acl; it should contain a line like this:

*/admin@HADOOP.COM *

The kdc.conf (/etc/krb5kdc/kdc.conf on Ubuntu) should be adjusted to look like this:

[kdcdefaults]
 kdc_ports = 88
 kdc_tcp_ports = 88

[realms]
 HADOOP.COM = {
  #master_key_type = aes256-cts
  acl_file = /etc/krb5kdc/kadm5.acl
  dict_file = /usr/share/dict/words
  admin_keytab = /etc/krb5kdc/kadm5.keytab
  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
}


The krb5.conf should look like the following. If you are on a multi-node cluster, this is the file you will copy to all the other hosts. Notice the entries under [domain_realm]:

[libdefaults]
  renew_lifetime = 7d
  forwardable = true
  default_realm = HADOOP.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}
  #default_tgs_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5
  #default_tkt_enctypes = aes des3-cbc-sha1 rc4 des-cbc-md5

[domain_realm]
  .hadoop.com = HADOOP.COM
  hadoop.com = HADOOP.COM

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind.log
  kdc = FILE:/var/log/krb5kdc.log

[realms]
  HADOOP.COM = {
    admin_server = test.hadoop.com
    kdc = test.hadoop.com
  }

Restart the Kerberos KDC and admin server daemons:

# systemctl restart krb5-kdc krb5-admin-server

Don't manually create any principal like "ambari_hdfs-050819@HADOOP.COM".

Go to the Ambari Kerberos wizard. In the Domains field, notice the leading . (dot):

kdc host = test.hadoop.com
Realm name = HADOOP.COM
Domains = .hadoop.com, hadoop.com
-----
kadmin host = test.hadoop.com
Admin principal = admin/admin@HADOOP.COM
Admin password = the password set during creation of the KDC database

From here, just accept the defaults and the keytabs should generate successfully.
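
Before re-running the wizard, a quick hedged sanity check: confirm the admin principal can actually authenticate against the new KDC (the password is the one chosen when the database was created). This is exactly the kinit that Ambari performs on your behalf.

```shell
# Sketch: verify the KDC answers and the admin principal can get a TGT.
PRINC="admin/admin@HADOOP.COM"
if command -v kinit >/dev/null 2>&1; then
  # kinit prompts for the password; klist should then show a
  # krbtgt/HADOOP.COM@HADOOP.COM ticket
  kinit "$PRINC" && klist \
    || echo "kinit failed; check krb5.conf and that krb5-kdc is running"
else
  echo "kinit not installed on this host (apt install krb5-user)"
fi
```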

Rising Star