Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 606 | 06-04-2025 11:36 PM |
| | 1152 | 03-23-2025 05:23 AM |
| | 573 | 03-17-2025 10:18 AM |
| | 2169 | 03-05-2025 01:34 PM |
| | 1365 | 03-03-2025 01:09 PM |
08-14-2017
09:15 AM
@Fahad Sarwar Can you attach the ambari-server log?
08-14-2017
08:05 AM
4 Kudos
@naveen sangam After creating the KDC database, do the following. In this example you are logged on to the KDC server kdc.dev.dataquest.com as root, on CentOS 7.

Check the principals; yours should look like this:

# kadmin.local
Authenticating as principal root/admin@DEV.DATAQUEST.COM with password.
kadmin.local: listprincs
K/M@DEV.DATAQUEST.COM
kadmin/admin@DEV.DATAQUEST.COM
kadmin/changepw@DEV.DATAQUEST.COM
kadmin/kdc.dev.dataquest.com@DEV.DATAQUEST.COM
kiprop/kdc.dev.dataquest.com@DEV.DATAQUEST.COM
krbtgt/DEV.DATAQUEST.COM@DEV.DATAQUEST.COM

You MUST create a root principal for kerberization:

kadmin.local: addprinc root/admin
WARNING: no policy specified for root/admin@DEV.DATAQUEST.COM; defaulting to no policy
Enter password for principal "root/admin@DEV.DATAQUEST.COM": {KDC_password}
Re-enter password for principal "root/admin@DEV.DATAQUEST.COM": {KDC_password}
Principal "root/admin@DEV.DATAQUEST.COM" created.

This is the admin you will use in the Ambari UI kerberization wizard:
principal: root/admin@DEV.DATAQUEST.COM
password: {KDC_password}
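Before handing that principal to the Ambari wizard, it is worth confirming it actually authenticates. A minimal sketch, using the realm and principal from the example above; the `kinit` step is shown as a comment because it needs the live KDC:

```shell
# The admin principal created above with addprinc (primary/instance@REALM).
PRINCIPAL="root/admin@DEV.DATAQUEST.COM"
# On a KDC client you would verify the password works before kerberizing:
#   kinit root/admin@DEV.DATAQUEST.COM && klist
# On a stock CentOS 7 MIT KDC, the principal must also match an ACL entry
# such as */admin@DEV.DATAQUEST.COM in /var/kerberos/krb5kdc/kadm5.acl.
echo "${PRINCIPAL}"
```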
08-11-2017
11:59 AM
@Elton Freitas Have a look at this solution.
08-11-2017
11:18 AM
@Chiranjeevi Nimmala Make the changes below in Hive and restart all stale configurations:
webhcat.proxyuser.root.groups = *
webhcat.proxyuser.root.hosts = *
See the Ambari Views documentation. Let me know if that helped.
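For reference, the two properties above live in webhcat-site.xml (edited through Ambari ---> Hive ---> Configs). A minimal fragment, assuming the Ambari server runs as root as in this thread — substitute your own user in the property names otherwise:

```xml
<!-- webhcat-site.xml: allow the root user to impersonate other users -->
<property>
  <name>webhcat.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>webhcat.proxyuser.root.hosts</name>
  <value>*</value>
</property>
```

The wildcard `*` is the permissive setting; in production you would usually restrict groups and hosts to what the view actually needs.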
08-10-2017
06:40 PM
@Thanuja Kularathna What are your memory settings for HBase? They seem too low. Can you share the values from Ambari UI ---> HBase ---> Configs ---> Settings:
HBase RegionServer maximum memory
HBase Master maximum memory
Cheers
08-10-2017
04:54 PM
@John Wright I am now almost certain your local repository wasn't configured correctly. Initially your hdp.repo had the content below:
baseurl=http://private-repo-1.hortonworks.com/HDP/centos7-ppc/2.x/updates/2.6.0.0-598
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ppc64le
With a local repository, your baseurl should instead look like either of the following:
http://FQDN_local_webserver/HDP/xxxxx
http://Local_IP/HDP/xxxxx
Both URLs must be accessible from the Ambari server. Please have a look at the Hortonworks documentation; once your internal repo is configured correctly and the firewall rules between the nodes are in place, you should be able to run the Ambari cluster setup successfully.
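The reachability check can be sketched as below. The FQDN is a placeholder for your local web server, and the `curl` probe is commented out because it needs the actual repo host:

```shell
# Hypothetical local repo URL of the form http://FQDN_local_webserver/HDP/...
FQDN="repo.example.com"
BASEURL="http://${FQDN}/HDP/centos7-ppc/2.x/updates/2.6.0.0"
# From the Ambari server (and every cluster node) the repo metadata
# must be reachable; a yum repo always serves repodata/repomd.xml:
#   curl -sI "${BASEURL}/repodata/repomd.xml"
echo "${BASEURL}"
```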
08-10-2017
01:49 PM
1 Kudo
@salma zegdene /boot/efi is the EFI system (boot) partition; why is your NameNode directory under it? Go to Ambari ---> HDFS ---> Configs and change the NameNode directory from /boot/efi/hadoop/hdfs/namenode to something else, for example /hadoop/hdfs/namenode, then restart the NameNode. That should resolve the problem. Please let me know.
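As a sketch, the fix amounts to moving the NameNode directory off the EFI partition. The target path mirrors the example above, and the privileged commands are comments since they need root on the NameNode host:

```shell
OLD_DIR="/boot/efi/hadoop/hdfs/namenode"   # bad: on the EFI system partition
NEW_DIR="/hadoop/hdfs/namenode"            # example target on a data disk
# On the NameNode host you would create the directory and hand it to
# the hdfs service user before changing the setting in Ambari:
#   mkdir -p /hadoop/hdfs/namenode
#   chown -R hdfs:hadoop /hadoop/hdfs/namenode
echo "NameNode dir: ${OLD_DIR} -> ${NEW_DIR}"
```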
08-10-2017
01:15 PM
@Vishal Gupta Experience is the best teacher; it's always good to follow the official documentation. I have never failed to kerberize because I stick to the document. The setup of the KDC and KDC clients is key to successfully kerberizing and unkerberizing an HDP cluster. Remember to always document 🙂 If my answer helped you, you can accept it and close this thread.
08-10-2017
01:09 PM
@sachin gupta Have you created a keytab for USER_1? Check with:
$ ls -al /etc/security/keytabs
Did USER_1 grab a valid Kerberos ticket? As USER_1, run:
$ klist
Do you have any output? USER_1 might have been blacklisted by the Ambari property hadoop.kms.blacklist.DECRYPT_EEK. That is the most probable reason why you are unable to decrypt as USER_1.
Did you give USER_1 read permission to that encryption zone?
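The checklist above can be sketched as a sequence of standard HDFS/KMS commands. The user name comes from the thread, the zone path is a hypothetical example, and the live commands are comments because they need a running, kerberized cluster:

```shell
EZ_USER="USER_1"            # user from the thread
EZ_PATH="/data/encrypted"   # hypothetical encryption zone path
# 1. Keytab present?             ls -al /etc/security/keytabs
# 2. Valid ticket for the user?  su - USER_1 -c klist
# 3. Encryption zones and keys:  hdfs crypto -listZones ; hadoop key list
# 4. Read permission on zone:    hdfs dfs -ls /data/encrypted
# 5. Check hadoop.kms.blacklist.DECRYPT_EEK in the Ambari/KMS configs.
echo "checking ${EZ_USER} on ${EZ_PATH}"
```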