Created on 03-09-2017 09:29 AM - edited 08-18-2019 06:12 AM
Hello,
I installed Kerberos in my cluster, and now the Hive service check fails.
I am getting this error:
How can I resolve this?
Created 03-09-2017 10:12 AM
The Hive service is alive, but the service check fails!
Created 03-09-2017 12:15 PM
Please go through the JIRA where this was fixed: https://issues.apache.org/jira/browse/AMBARI-18157
This was fixed in the Ambari 2.4 patch releases.
Created 03-09-2017 01:48 PM
Hello @Dileep Kumar Chiguruvada,
The JIRA does not indicate how I can resolve the problem!
I am using Ambari 2.4.2.
Created 03-09-2017 07:16 PM
If you are receiving an unauthorized connection error, as mentioned in the JIRA, then you can try the steps below. They might help you resolve this issue.
{"error":"Unauthorized connection for super-user: HTTP/HOST_NAME@REALM_NAME from IP HOST_IP"}http_code <500>
1. Identify and note the node where the WebHCat Server runs.
2. In Ambari, under HDFS > Configs, check whether hadoop.proxyuser.HTTP.hosts is defined in the core-site section.
3. If it exists, update the parameter to include the WebHCat node name.
4. If not, add the parameter and include the WebHCat node name.
5. Restart all services that need a restart, including Hive.
6. Run the Hive service check again.
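For reference, the steps above amount to properties like the following in core-site.xml (the host name is a placeholder for your WebHCat node; `*` for groups allows impersonation of any group and may be tightened for production):

```xml
<!-- core-site.xml: allow the HTTP (SPNEGO) principal to impersonate
     users when requests come from the WebHCat node.
     webhcat-host.example.com is a placeholder. -->
<property>
  <name>hadoop.proxyuser.HTTP.hosts</name>
  <value>webhcat-host.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.groups</name>
  <value>*</value>
</property>
```

In Ambari these are edited under HDFS > Configs (Custom core-site) rather than in the file directly, so that Ambari does not overwrite them on restart.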
Created 03-14-2017 06:38 AM
In Custom core-site I see that the host running the WebHCat Server is already included, and the same issue still occurs.
@Shyam Shaw, can you help me?
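If the property looks correct in Ambari but the check still fails, it can help to confirm that the setting actually reached the cluster and to exercise WebHCat directly. A rough sketch, assuming a keytab path and host names that you would replace with your cluster's values (50111 is WebHCat's default port):

```shell
# Verify the effective value of the proxyuser setting as the cluster sees it:
hdfs getconf -confKey hadoop.proxyuser.HTTP.hosts

# Obtain a Kerberos ticket (keytab path and principal are placeholders):
kinit -kt /etc/security/keytabs/smokeuser.headless.keytab ambari-qa

# Hit the WebHCat REST status endpoint with SPNEGO authentication:
curl --negotiate -u : "http://webhcat-host.example.com:50111/templeton/v1/status"
```

If `hdfs getconf` does not show the WebHCat host, the change likely has not been pushed to all nodes; restarting HDFS and the affected services from Ambari should propagate it.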