Support Questions

Find answers, ask questions, and share your expertise

HIVE services check failing

Super Collaborator

I am trying to set up the Hive View for Ambari users, but it is not working. It does not give any specific error either; this is what I see. I am not using Kerberos. I have the /user/admin directory and the proxy parameters set.

$ hdfs dfs -ls /user/
Found 13 items
drwxr-xr-x  - admin  hadoop  0 2018-08-20 17:27 /user/admin

hadoop.proxyuser.root.groups=*
hadoop.proxyuser.root.hosts=*

Issues detected

Service 'userhome' check failed: hadoop1.tolls.dot.state.fl.us:50070: Unexpected HTTP response: code=504 != 200, op=GETFILESTATUS, message=Gateway Timeout

92535-capture.jpg

1 ACCEPTED SOLUTION

Master Mentor

@Sami Ahmad

Since we see a "504" error here, which is a Gateway Timeout returned by a proxy, please check whether you have an HTTP proxy / network proxy enabled on your end.

I suspect that the WebHDFS requests originated by the Hive View are actually passing through an HTTP proxy configured on your cluster. You will need to either make the requests bypass the proxy server or make the proxy work.

So please check the following:

1. Check the environment settings to find out whether any HTTP proxy is set (look for 'proxy'):

# /var/lib/ambari-agent/ambari-sudo.sh su hdfs -l -s /bin/bash -c 'env'  
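To narrow the output of the check above, you can pipe it through grep. The proxy host in the comments is a made-up example; yours will differ:

```shell
# Filter the hdfs user's environment for proxy-related variables.
/var/lib/ambari-agent/ambari-sudo.sh su hdfs -l -s /bin/bash -c 'env' | grep -i proxy
# Offending entries typically look like (hypothetical host):
#   http_proxy=http://proxy.corp.example:3128
#   https_proxy=http://proxy.corp.example:3128
```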


2. See whether you can make the WebHDFS call from a terminal on the Ambari server host, and check in the output whether the request is being passed via the proxy:

# curl -ivL -X GET "http://$ACTIVE_NAME_NODE:50070/webhdfs/v1/user/admin?op=GETHOMEDIRECTORY&user.name=admin"
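
If the environment check turns up a proxy, one quick way to confirm it is the culprit is to repeat the call while telling curl to bypass the proxy entirely (using curl's --noproxy option; $ACTIVE_NAME_NODE is your NameNode host, as above):

```shell
# Same WebHDFS call, but force curl to ignore any http_proxy settings;
# if this succeeds while the plain call times out, the proxy is the problem.
curl -ivL --noproxy '*' \
  "http://$ACTIVE_NAME_NODE:50070/webhdfs/v1/user/admin?op=GETHOMEDIRECTORY&user.name=admin"
```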


3. You can also refer to the following doc on how to enable HTTP proxy settings in Ambari Server (you can also set an Ambari JVM property so that requests to your cluster nodes are NOT passed via the proxy). See: https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.0/bk_ambari-administration/content/ch_setting...

-Dhttp.nonProxyHosts=<pipe|separated|list|of|hosts>


4. Alternatively, you can configure "no_proxy" globally in "~/.bash_profile" or "/etc/profile" to make sure that your internal cluster requests are not passed via the proxy.

no_proxy=".example.com"
export no_proxy
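
For step 4, a slightly fuller sketch of what might go in /etc/profile — the ".example.com" suffix is a placeholder; substitute your cluster's domain:

```shell
# Exempt localhost and every host under the cluster's domain from the proxy.
no_proxy="localhost,127.0.0.1,.example.com"
export no_proxy
```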


4 REPLIES

Super Collaborator

The Ambari server log shows the following:

24 Sep 2018 17:35:20,856 ERROR [ambari-client-thread-199385] WebHdfsFileSystem:402 - Unable to get HomeDirectory from original File System
java.io.IOException: hadoop1.tolls.dot.state.fl.us:50070: Unexpected HTTP response: code=504 != 200, op=GETHOMEDIRECTORY, message=Gateway Timeout

If I press the back arrow button, it does show me the Hive command window, but if I enter any command there I get this error:

92536-capture.jpg


Super Collaborator

Yes, I am behind a proxy, but only for outgoing calls. The Hive View should not try to go outside my cluster, right?

Ambari is already configured for the proxy and working fine. All the modules like Hive, HBase, curl, yum, etc. work fine, so why would I have an issue only with Ambari Views? None of them work.

Super Collaborator

You were absolutely right: it was the proxy settings causing it. I removed the proxy settings from ambari-server and bounced it, and all views are working now.

I tried to append -Dhttp.nonProxyHosts=hadoop1|hadoop2|hadoop3 to the Ambari proxy settings, but it didn't like it. Can you give me the right syntax, please?
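
The usual snag here is that "|" is a shell metacharacter, so the value has to be quoted when it is placed in ambari-env.sh. A sketch, assuming the AMBARI_JVM_ARGS line in /var/lib/ambari-server/ambari-env.sh (the host names are the ones from the question; your existing JVM flags may differ):

```shell
# /var/lib/ambari-server/ambari-env.sh
# Quote the whole value so the shell does not treat '|' as a pipe.
export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.nonProxyHosts=hadoop1|hadoop2|hadoop3"
```

After editing, restart with `ambari-server restart` for the JVM argument to take effect.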