Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 609 | 06-04-2025 11:36 PM |
| | 1175 | 03-23-2025 05:23 AM |
| | 579 | 03-17-2025 10:18 AM |
| | 2183 | 03-05-2025 01:34 PM |
| | 1373 | 03-03-2025 01:09 PM |
09-04-2017
09:13 AM
@Kishore Kumar Can you do the following? As user root:
# su - hdfs
Now, as user hdfs:
$ hdfs dfsadmin -safemode leave
Then restart the service through Ambari.
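Safemode is worth checking before forcing it off. A minimal sketch, where the `needs_leave` helper is hypothetical and simply inspects the text that `hdfs dfsadmin -safemode get` prints:

```shell
# Hypothetical helper: decide from the output of
#   hdfs dfsadmin -safemode get
# whether a "safemode leave" is actually needed.
needs_leave() {
  case "$1" in
    *"Safe mode is ON"*) return 0 ;;   # NameNode is in safemode
    *)                   return 1 ;;   # already out, nothing to do
  esac
}

# On a live cluster (as the hdfs user) this would drive the real command:
#   status=$(hdfs dfsadmin -safemode get)
#   needs_leave "$status" && hdfs dfsadmin -safemode leave
```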
09-04-2017
08:55 AM
@Ashnee Sharma Can you check the value of hive.server2.enable.doAs under Hive --> Configs --> Settings (filter on hive.server2.enable.doAs)? Also, do you have a Ranger default policy configured for the hive user?
09-04-2017
08:31 AM
@Rajendra Manjunath Can you give details of the cluster: HDP version? Kerberized or not? Sandbox or bare metal?
Can you confirm that HDFS is started in the Ambari UI and that WebHDFS is enabled under HDFS --> Configs --> Advanced?
What value do you have for fs.defaultFS under HDFS --> Configs --> Advanced --> Advanced core-site?
What values do you have for the properties below (they should be *) under HDFS --> Configs --> Advanced --> Custom core-site:
hadoop.proxyuser.hdfs.groups
hadoop.proxyuser.hdfs.hosts
And, in the same Custom core-site section, filtering on hadoop.proxyuser.ambari-server, what values do you have for:
hadoop.proxyuser.ambari-server-xxxx.groups
hadoop.proxyuser.ambari-server-xxxx.hosts
Please let me know.
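For reference, the proxyuser entries asked about above live in core-site.xml and, when set to the permissive value the reply expects, look roughly like this (the ambari-server-xxxx suffix is cluster-specific and left as a placeholder):

```xml
<!-- Custom core-site: allow the hdfs superuser to impersonate other users.
     "*" is the wide-open value; restrict it in hardened clusters. -->
<property>
  <name>hadoop.proxyuser.hdfs.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.hosts</name>
  <value>*</value>
</property>
```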
09-02-2017
04:16 AM
@btandel Have you tried running import-hive.sh?
/usr/hdp/current/atlas-server/hook-bin/import-hive.sh
See the documentation.
09-01-2017
09:05 PM
@Sam Red The "Service 'userhome' check failed" error is easy to resolve; you need to create the hive user's home directory in HDFS:
# su - hdfs
$ hdfs dfs -mkdir /user/hive
$ hdfs dfs -chown hive:hdfs /user/hive
Then retry the Hive view. That should work; let me know.
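The same two commands generalize to any user's home directory. A small sketch, where the `provision_cmds` helper is made up for illustration (it only prints the commands, so nothing runs until you pipe its output to a shell as the hdfs user; the `hive:hdfs` ownership matches the reply above):

```shell
# Hypothetical helper: print the commands needed to provision a user's
# HDFS home directory, owned by that user in group "hdfs".
provision_cmds() {
  user="$1"
  printf 'hdfs dfs -mkdir -p /user/%s\n' "$user"
  printf 'hdfs dfs -chown %s:hdfs /user/%s\n' "$user" "$user"
}

# Live use on the cluster, as the hdfs user:
#   provision_cmds hive | sh
```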
09-01-2017
08:35 PM
@Sam Red I have also been trying to understand what is wrong. What is the output of this command?
# klist -kt /etc/security/keytabs/ambari.server.keytab
Keytab name: FILE:/etc/security/keytabs/ambari.server.keytab
KVNO Timestamp Principal
---- ------------------- ------------------------------------------------------
1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
1 08/24/2017 15:42:24 ambari-server-abc_bigxxxline@ROMAT.COM
Then grab a valid Kerberos ticket:
$ kinit -kt /etc/security/keytabs/ambari.server.keytab ambari-server-abc_bigxxxline@ROMAT.COM
Then retry the access.
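When the keytab holds several entries, the principal name can be pulled out of the `klist -kt` listing instead of typed by hand. A sketch, where the `first_principal` helper is hypothetical and just parses the header-plus-entries layout shown above (entries start on line 4, principal is the last field):

```shell
# Hypothetical helper: print the principal from the first entry of
# `klist -kt` output.
first_principal() {
  awk 'NR > 3 { print $NF; exit }'
}

# Live use, assuming $KT points at the keytab file:
#   kinit -kt "$KT" "$(klist -kt "$KT" | first_principal)"
```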
09-01-2017
05:04 PM
@Vicente Ciampa There are a couple of things to check. If all applications are failing, there is a serious problem with your Kerberos setup. To help troubleshoot, can you obscure the sensitive info in the files below and attach them:
/etc/krb5.conf
/var/kerberos/krb5kdc/kdc.conf
/var/kerberos/krb5kdc/kadm5.acl
solr_jaas.conf
Did the kerberization through the Ambari UI succeed? As root, can you run the command below on the KDC server and see whether the solr principal was created:
# kadmin.local
Authenticating as principal root/admin@REALM with password.
kadmin.local: listprincs
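Scanning the `listprincs` output by eye is error-prone, so the check can be automated. A sketch, where the `has_principal` helper is made up for illustration (`kadmin.local -q listprincs` is the non-interactive form of the same command):

```shell
# Hypothetical helper: succeed if stdin contains a principal that starts
# with the given prefix (e.g. "solr/").
has_principal() {
  grep -q "^$1"
}

# Live use on the KDC host, as root:
#   kadmin.local -q listprincs | has_principal 'solr/'
```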
09-01-2017
11:25 AM
@Nilambari Ahuja There you go, the hanging was due to memory issues.
09-01-2017
08:37 AM
@Nilambari Ahuja Sorry to be the bearer of bad news: the HDP sandbox needs a minimum of 12 GB of memory (8 GB for HDP and 4 GB for the OS). So to successfully install HDP you should have 12 GB, or 16 GB for great performance. Hope that helps.
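That 12 GB requirement can be pre-checked before attempting the install. A rough sketch, assuming a Linux host with `free` available; the `enough_ram` helper is made up for illustration:

```shell
# Hypothetical pre-flight check: does the host have at least 12 GB of RAM
# (12288 MB) for the HDP sandbox plus the OS?
enough_ram() {
  [ "$1" -ge 12288 ]   # $1 = total memory in MB
}

# Live use:
#   enough_ram "$(free -m | awk '/^Mem:/ {print $2}')" && echo "enough RAM"
```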
09-01-2017
08:20 AM
@Nilambari Ahuja What exactly are you experiencing?