Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 553 | 06-04-2025 11:36 PM |
| | 1100 | 03-23-2025 05:23 AM |
| | 561 | 03-17-2025 10:18 AM |
| | 2104 | 03-05-2025 01:34 PM |
| | 1317 | 03-03-2025 01:09 PM |
10-14-2019
10:51 AM
@saivenkatg55 What do you mean by local (MIT)? If I guess right, you are accessing the HDP cluster from a client laptop or an edge node where you installed the Kerberos client libraries. To communicate with secure Hadoop clusters that use Kerberos authentication, known as Kerberized clusters, a client uses the Kerberos client utilities. You MUST install these utilities on the same system you are connecting from.

For Linux desktops here are the different options.

Ubuntu:

```
# apt install krb5-user
```

RHEL/CentOS:

```
# yum install -y krb5-libs krb5-workstation
```

(krb5-server is only needed on the KDC host itself, not on clients.)

These packages deliver the krb5.conf that the client must configure to connect to a Kerberized cluster. The easier and recommended way is to copy the krb5.conf from the KDC server to all clients that need to connect to the Kerberized cluster; in RHEL/CentOS it's located at /etc/krb5.conf. This file has the pointers to the REALM, KDC and admin server. Here is an example:

```
[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 default_realm = REDHAT.COM
 dns_lookup_realm = false
 dns_lookup_kdc = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true

[realms]
 REDHAT.COM = {
  kdc = KDC.REDHAT.COM
  admin_server = KDC.REDHAT.COM
 }

[domain_realm]
 .redhat.com = REDHAT.COM
 redhat.com = REDHAT.COM
```

Else share /var/log/krb5kdc.log and /var/log/kadmind.log.

HTH
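A quick sanity check of the krb5.conf above can be done from the shell before trying kinit. This is just a sketch: it reads a copy of the example file from a heredoc (on a real client you would point awk at /etc/krb5.conf) and pulls out the realm and KDC the client will use:

```shell
# Sketch: extract the default realm and KDC from a krb5.conf.
# The heredoc below mirrors the example config; on a real client,
# replace "$conf" with /etc/krb5.conf.
conf=$(mktemp)
cat > "$conf" <<'EOF'
[libdefaults]
 default_realm = REDHAT.COM
[realms]
 REDHAT.COM = {
  kdc = KDC.REDHAT.COM
  admin_server = KDC.REDHAT.COM
 }
EOF
realm=$(awk -F'= *' '/default_realm/ {print $2}' "$conf")
kdc=$(awk -F'= *' '/^ *kdc/ {print $2}' "$conf")
echo "realm=$realm kdc=$kdc"
rm -f "$conf"
```

If the printed realm or KDC does not match what the KDC server's own krb5.conf says, fix the client copy before debugging anything else.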
10-10-2019
11:53 PM
@irfangk1 That means you didn't install MySQL. Can I ask you one question: what documentation are you using as a reference? The basic steps are described when you are preparing the environment. To me it seems you launched the deployment using the embedded PostgreSQL database, but now you want to switch to MySQL. To do that you need to install MySQL and pre-create the Ambari database before launching the Ambari server setup; that explains why you are getting all those errors. I can send you a procedure in 9 hours.
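The pre-creation step mentioned above can be sketched as follows. This is not a verified procedure for your environment: the database name `ambari`, user `ambari` and password `bigdata` are placeholders, so substitute your own values, and the final `mysql` invocation is left commented out because it needs a running MySQL server:

```shell
# Sketch: pre-create the Ambari database and user before running
# `ambari-server setup`. Names and password below are placeholders.
cat > /tmp/ambari-ddl.sql <<'EOF'
CREATE DATABASE ambari;
CREATE USER 'ambari'@'%' IDENTIFIED BY 'bigdata';
GRANT ALL PRIVILEGES ON ambari.* TO 'ambari'@'%';
FLUSH PRIVILEGES;
EOF
# Load it against a running MySQL server (commented out here):
# mysql -u root -p < /tmp/ambari-ddl.sql
grep -c ambari /tmp/ambari-ddl.sql
```

Only after the database and user exist should you re-run `ambari-server setup` and point it at MySQL.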
10-10-2019
09:19 PM
@irfangk1 Can you share a screenshot of your Ambari Hive config? When I see "jdbc:mysql://host.com/metastore" "hive", that tells me the JDBC URL has no port set. For comparison, the default TCP port for PostgreSQL is usually 5432; see postgresql.conf.
10-10-2019
01:18 PM
@irfangk1 If you switched to MySQL, why do you have these entries in your ambari.properties?

```
server.jdbc.database=postgres
server.jdbc.database_name=ambari
server.jdbc.postgres.schema=ambari
```

Can you share the output of:

```
# cat /etc/ambari-server/conf/ambari.properties | grep mysql
```

Why is it that in the log I again see "jdbc:mysql://host.com/metastore" "hive"? Where is the PORT number?
10-10-2019
01:01 PM
@irfangk1 This line in your error log says it all:

```
org.apache.ambari.server.DBConnectionVerification "jdbc:mysql://host.com/metastore" "hive"
```

The format should be jdbc:mysql://host-name:port; for MySQL the port is usually 3306. To initiate any database connectivity you need to know the port at which the database is listening for connections.

ambari.properties:

```
# cat /etc/ambari-server/conf/ambari.properties | grep mysql
custom.mysql.jdbc.name=mysql-connector-java.jar
previous.custom.mysql.jdbc.name=mysql-connector-java.jar
server.jdbc.database=mysql
server.jdbc.driver=com.mysql.jdbc.Driver
server.jdbc.rca.driver=com.mysql.jdbc.Driver
server.jdbc.rca.url=jdbc:mysql://your_host:3306/ambari
server.jdbc.url=jdbc:mysql://your_host:3306/ambari
```

Back up these files.

Verification jar:

```
# mv /var/lib/ambari-agent/cache/DBConnectionVerification.jar /var/lib/ambari-agent/cache/DBConnectionVerification.jar.bck
```

MySQL connector:

```
# mv /var/lib/ambari-agent/cache/mysql-connector-java.jar /var/lib/ambari-agent/cache/mysql-connector-java.jar.bck
```

Then restart Ambari and the agent.
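The missing-port problem above can be caught with a quick format check. This is a hedged sketch (the `check_port` helper is made up for illustration); the first URL is the broken one from the error log, the second is the corrected form:

```shell
# Sketch: check whether a MySQL JDBC URL carries an explicit port.
# check_port is a hypothetical helper, not part of Ambari.
check_port() {
  echo "$1" | grep -Eq '^jdbc:mysql://[^/:]+:[0-9]+/' \
    && echo "$1 -> port present" \
    || echo "$1 -> MISSING PORT"
}
check_port "jdbc:mysql://host.com/metastore"
check_port "jdbc:mysql://host.com:3306/metastore"
```

Running it prints "MISSING PORT" for the first URL and "port present" for the second, which is exactly the fix the error log is asking for.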
10-10-2019
10:24 AM
@vsrikanth9 Great, it worked! But you should recognize that even if you had modified /var/kerberos/krb5kdc/kadm5.acl, your krb5.conf was still wrong and your Ambari UI config was wrong, so you still wouldn't have resolved it 🙂 Happy hadooping
10-10-2019
07:19 AM
@vsrikanth9 Not exactly; now the REALM part is wrong. The rest are okay, but you substituted the wrong values. Here is how it's supposed to be; see the highlighted part:

```
# Configuration snippets may be placed in this directory as well
includedir /etc/krb5.conf.d/

[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log

[libdefaults]
 dns_lookup_realm = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true
 rdns = false
 pkinit_anchors = /etc/pki/tls/certs/ca-bundle.crt
 default_realm = HADOOPSECURITY.COM
 default_ccache_name = KEYRING:persistent:%{uid}

[realms]
 HADOOPSECURITY.COM = {
  kdc = p1.bigdata.com
  admin_server = p1.bigdata.com
 }

[domain_realm]
 .hadoopsecurity.com = HADOOPSECURITY.COM
 hadoopsecurity.com = HADOOPSECURITY.COM
```

Do that and let me know. The KDC and admin server are usually the same host 🙂
10-10-2019
01:22 AM
@irfangk1 There are a couple of things to do. First check your disk space and inode usage. To rule out permissions, here is the listing for the relevant directory:

```
$ ls -la /var/lib/mysql
```

Can you share /var/log/mysql/mysql.log? What is the value of innodb_buffer_pool_size in the config file /etc/mysql/my.cnf? Can you edit my.cnf by adding:

```
[mysqld]
innodb_force_recovery = 1
```

and then running:

```
sudo systemctl start mysql
```

Hope that helps
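The disk, inode and permission checks above can be run in one go. A minimal sketch, assuming the default datadir /var/lib/mysql (adjust the paths if yours is relocated):

```shell
# Sketch of the checks suggested above: disk space, inodes, datadir perms.
df -h /var | tail -n 1    # free space on the filesystem holding the datadir
df -i /var | tail -n 1    # inode usage; IUse% at 100% also stops MySQL
ls -ld /var/lib/mysql 2>/dev/null || echo "no /var/lib/mysql on this host"
```

A full filesystem or exhausted inodes will keep MySQL from starting just as surely as corrupt InnoDB pages, so rule those out before reaching for innodb_force_recovery.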
10-09-2019
11:11 PM
@vsrikanth9

1. The KDC part of your screenshot has an error 🙂 In the Domains field, just copy and paste the below as-is to replace p1.bigdata.com. Note the dot (.) and the comma separating the names:

```
.hadoopsecurity.com,hadoopsecurity.com
```

The validation passed because in reality it only tests connectivity to the KDC server.

2. Then, in the kadmin part, the admin principal should match the output of your `# kadmin.local`, something like admin/admin@HADOOPSECURITY.COM or root/admin@HADOOPSECURITY.COM, whatever you chose during the installation of Kerberos. After that, launch the regeneration of the keytabs and all should be okay. Make sure the KDC server is up and running during this process.

Please revert.
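The shape of the Domains value can be checked before pasting it into Ambari. A small sketch (the regex is only an illustration of the expected "leading-dot entry, comma, bare domain" pair, not an Ambari validation rule):

```shell
# Sketch: verify the Domains string has the dot-prefixed wildcard entry,
# a comma, and the bare domain, as in the value above.
domains=".hadoopsecurity.com,hadoopsecurity.com"
if echo "$domains" | grep -Eq '^\.[A-Za-z0-9.-]+,[A-Za-z0-9.-]+$'; then
  echo "format OK"
else
  echo "check the leading dot and the comma"
fi
```

Forgetting either the leading dot or the comma is what causes host-to-realm mapping to silently fail.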