Support Questions


error loading DBConnectionVerification.jar

Super Collaborator

I know I have a proxy issue somewhere, but in order to find it I need to understand the flow. How does this JAR file get downloaded to a node from the Ambari server? The log for testing the "hive" module from the Ambari console is not giving any detail, just that it couldn't download this JAR file.

It's some kind of localized proxy issue, since other operations have worked fine so far.

MySQL connection error:

2017-06-16 21:27:10,803 - Error downloading DBConnectionVerification.jar from Ambari Server resources. Check network access to Ambari Server.
HTTP Error 504: Gateway Timeout
1 ACCEPTED SOLUTION

Master Mentor

@Sami Ahmad - Let's try to understand what is happening behind the error:

2017-06-16 21:27:10,803 - Error downloading DBConnectionVerification.jar
 from Ambari Server resources. Check network access to Ambari Server.
HTTP Error 504: Gateway Timeout

- Ambari uses the JAR "/var/lib/ambari-server/resources/DBConnectionVerification.jar" to perform various DB connectivity tests.

- This JAR needs to be present on all of the Ambari Agent hosts. Each Ambari Agent uses the following URL to download the JAR from the Ambari server host and then places it at "/var/lib/ambari-agent/tmp/DBConnectionVerification.jar":

http://$AMBARI_SERVERHOST:8080/resources/DBConnectionVerification.jar

- Now, if for some reason the agents are not able to download this JAR from the Ambari server host over HTTP, then the clients (agents) will fail to perform the DB connection check.

- In your case the agents are not able to download this JAR from the Ambari server host because of the following error:

HTTP Error 504: Gateway Timeout

- This indicates that access to the following URL is failing from the agent machines (wget is used here just for demonstration; the agents use a Python-based approach to download this JAR):

wget http://$AMBARI_SERVERHOST:8080/resources/DBConnectionVerification.jar

What is the root cause? Copying this JAR manually from the Ambari server host to the agent machines can make it work temporarily, but it will not fix the underlying issue. We should find out why the agents are not able to download this JAR from the Ambari server host. Agents should be able to access the Ambari server resources over HTTP (a quick check is sketched below).
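
For example, a quick way to isolate the proxy from an agent host (a rough check only, not the agent's actual code path; $AMBARI_SERVERHOST is a placeholder for your Ambari server hostname):

# 1) Try the download while bypassing any configured proxy:
wget --no-proxy -O /tmp/DBConnectionVerification.jar http://$AMBARI_SERVERHOST:8080/resources/DBConnectionVerification.jar

# 2) Try the same download through the configured proxy:
wget -e use_proxy=yes -O /tmp/DBConnectionVerification.jar http://$AMBARI_SERVERHOST:8080/resources/DBConnectionVerification.jar

If (1) succeeds but (2) fails with a 504 Gateway Timeout, the proxy is intercepting traffic that should stay local.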



13 REPLIES

Super Collaborator

The Ambari server log, on the other hand, shows a very different error when I hit the "Test connection" button:

16 Jun 2017 21:35:22,149  WARN [ambari-action-scheduler] ExecutionCommandWrapper:185 - Unable to lookup the cluster byt ID; assuming that there is no cluster and therefore no configs for this execution command: Cluster not found, clusterName=clusterID=-1
16 Jun 2017 21:35:23,153  WARN [ambari-action-scheduler] ExecutionCommandWrapper:185 - Unable to lookup the cluster byt ID; assuming that there is no cluster and therefore no configs for this execution command: Cluster not found, clusterName=clusterID=-1
16 Jun 2017 21:35:24,156  WARN [ambari-action-scheduler] ExecutionCommandWrapper:185 - Unable to lookup the cluster byt ID; assuming that there is no cluster and therefore no configs for this execution command: Cluster not found, clusterName=clusterID=-1
16 Jun 2017 21:35:25,158  WARN [ambari-action-scheduler] ExecutionCommandWrapper:185 - Unable to lookup the cluster byt ID; assuming that there is no cluster and therefore no configs for this execution command: Cluster not found, clusterName=clusterID=-1
16 Jun 2017 21:35:26,161  WARN [ambari-action-scheduler] ExecutionCommandWrapper:185 - Unable to lookup the cluster byt ID; assuming that there is no cluster and therefore no configs for this execution command: Cluster not found, clusterName=clusterID=-1

Super Collaborator

Looking at similar issues online, I have already checked that:

1- the ntpd service is running on all nodes.

2- the ambari-agent and ambari-server packages are the same version on all nodes:

[root@hadoop1 ambari-server]# rpm -qa | grep ambari
ambari-metrics-grafana-2.4.3.0-30.x86_64
ambari-agent-2.4.3.0-30.x86_64
ambari-server-2.4.3.0-30.x86_64

Expert Contributor

Please try the following; it may be helpful:

1) Download the MySQL connector JAR and keep it at /usr/share/java/mysql-connector-java.jar.

2) The MySQL client should be installed on the Ambari server host (not required if MySQL itself is installed there).

3) Verify that the MySQL connection (connector, username, and password) works from the Ambari server host; please check as below (a simpler manual check is also sketched after the output).

Cross-checking whether the MySQL connector works on the Ambari server:

[root@]#/usr/lib/hive/bin/schematool -initSchema -dbType mysql -userName hive -passWord hive
Metastore connection URL:        jdbc:mysql://centos2.test.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Starting metastore schema initialization to 0.13.0
Initialization script            hive-schema-0.13.0.mysql.sql
Initialization script completed
schemaTool completeted
[root@centos ~]#
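
If schematool is not handy, a simpler manual check from the Ambari server host could look like the following (a sketch only; the hostname and credentials are the ones from the example above, so adjust them to your environment, and it assumes the mysql client is installed):

# Confirm the credentials and the network path to MySQL:
mysql -h centos2.test.com -u hive -phive -e "SELECT 1;"

# Confirm the connector JAR is where Ambari expects to find it:
ls -l /usr/share/java/mysql-connector-java.jar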

Super Collaborator

My Hive environment is on hadoop2 and the Ambari server is on hadoop1. Do you want me to move mysqld to hadoop1? In that case I think schematool will fail when run on hadoop2.

I don't think I am understanding; please clarify.


Master Mentor

@Sami Ahmad

Yum uses the proxy settings from the "/etc/yum.conf" file by default. That might not be true for other utilities.

Also, please check that the "~/.bash_profile" and "~/.profile" scripts define the proxy settings so that they apply globally, e.g.:

export http_proxy=http://dotatofwproxy.tolls.dot.state.fl.us:8080


As another check, please validate whether the mentioned proxy host and port are working correctly by using plain "wget" as follows:

wget  -e use_proxy=yes -e http_proxy=http://dotatofwproxy.tolls.dot.state.fl.us:8080   http://$AMBARI_SERVERHOST:8080/resources/DBConnectionVerification.jar
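
It is also worth confirming which proxy variables the running ambari-agent process itself sees, since a daemon does not necessarily inherit what your login shell exports (a sketch; the pgrep pattern is an assumption and may need adjusting to match your agent's process name):

# Proxy variables in the current shell:
env | grep -i proxy

# Proxy variables in the ambari-agent daemon's environment:
cat /proc/$(pgrep -f ambari_agent | head -1)/environ | tr '\0' '\n' | grep -i proxy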

Super Collaborator

Hi Jay, I found the issue I am facing, but I don't know why I am having it this time and not the first time when I installed this cluster.

Here is the issue: wget should not be using the proxy server to download the DBConnectionVerification.jar file, because it is a local server. For this I have defined "no_proxy" settings in the /etc/environment file as follows:

[root@hadoop1 ~]# cat /etc/environment
http_proxy="http://dotatofwproxy.tolls.dot.state.fl.us:8080/"
https_proxy="https://dotatofwproxy.tolls.dot.state.fl.us:8080/"
ftp_proxy="ftp://dotatofwproxy.tolls.dot.state.fl.us:8080/"
no_proxy=".tolls.dot.state.fl.us,.to.dot.state.fl.us,10.0.0.0/8"

So why is wget not picking up the no_proxy settings?
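
One quick way to narrow this down (a sketch; hadoop1 is assumed to be the Ambari server hostname as mentioned earlier in the thread, and note that many tools match no_proxy entries as hostname suffixes, so a CIDR range like 10.0.0.0/8 may not be honored):

# Is no_proxy even present in this shell's environment?
echo "no_proxy=$no_proxy"

# Does the download work when the proxy is explicitly disabled?
wget --no-proxy -O /dev/null http://hadoop1:8080/resources/DBConnectionVerification.jar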

Super Collaborator

Thinking back, I think I installed the cluster the first time entirely from local repositories; in that case I would not need proxy access, right?

This time I installed Ambari using a local repository and HDP using the public repository.

Master Mentor

@Sami Ahmad

Yes, you should set the "no_proxy" variable, which should contain a comma-separated list of domain extensions that the proxy should not be used for.

Also, on the Ambari side you should make sure to set "-Dhttp.proxyHost" and "-Dhttp.proxyPort"; to prevent some host names from going through the proxy server, define the list of excluded hosts with nonProxyHosts, as follows (a rough example follows the documentation link below):

-Dhttp.nonProxyHosts=<pipe|separated|list|of|hosts>

Please see: https://docs.hortonworks.com/HDPDocuments/Ambari-2.4.0.0/bk_ambari-reference/content/ch_setting_up_a...
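
As a rough illustration only (the linked document describes the exact procedure; the proxy and host names below are taken from this thread, and the AMBARI_JVM_ARGS setting in /var/lib/ambari-server/ambari-env.sh may differ by Ambari version), the JVM options might end up looking like:

export AMBARI_JVM_ARGS="$AMBARI_JVM_ARGS -Dhttp.proxyHost=dotatofwproxy.tolls.dot.state.fl.us -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts=localhost|hadoop1|hadoop2|*.tolls.dot.state.fl.us"

Restart the Ambari server afterwards for the change to take effect.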
