03-31-2017
10:29 AM
Ta Deepesh - that fixed it. At least I can now get one step further... schematool now loads the JDBC drivers before abending [on a 'connection refused' exception, even though I can log into the MySQL db from the command line... sigh]. Thanks a million...
By the way, I put a symlink into HIVE_HOME/lib for the file (as opposed to physically copying it) - having multiple physical copies of something like a jdbc driver JAR creates too many opportunities for versions to get out of synch for my liking.
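For anyone who finds this later, the symlink approach looks roughly like this - a sketch only, since the connector location and the Hive lib path below are typical HDP defaults and may differ on your installation:

```shell
# Sketch - paths are typical HDP defaults, adjust for your layout.
# Link the shared connector JAR into Hive's lib directory instead of
# copying it, so there is only one physical file to keep current.
CONNECTOR=/usr/share/java/mysql-connector-java.jar
HIVE_LIB=/usr/hdp/current/hive-client/lib
sudo ln -s "$CONNECTOR" "$HIVE_LIB/mysql-connector-java.jar"
```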
03-30-2017
01:28 PM
I have a functioning HDP cluster that seems to be working perfectly except for Hive and Oozie, which will not start. The problems with these two cropped up at install time, but I chose to ignore them then as I did not need to use them immediately (all I needed at first were basic Hadoop things like HDFS, Sqoop and Flume, Map-Reduce, and so forth), so I installed them and left the problems for later. I am now revisiting these components, as I need to find and fix whatever is going wrong.

I have started with Hive and worked my way up the problem chain, starting with the Hive Metastore failing to start and working backward. To cut a long story very short, the step in the chain that is failing is the schematool initialization of the Hive metastore database schema, and here the underlying (root) cause seems to be a JDBC problem - it fails to load the drivers for some reason. The relevant parts of the error message are:

/usr/hdp/current/hive-client/bin/schematool -initSchema -dbType mysql
Metastore connection URL: jdbc:mysql://hdata2.edi.local/hive
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to load driver
Underlying cause: java.lang.ClassNotFoundException : com.mysql.jdbc.Driver
*** schemaTool failed ***
Other programs - like Sqoop, for example - can connect to this MySQL database over JDBC, so I cannot figure out why the Hive schematool cannot. The error messages suggest that it does not find the JDBC jar file, but the right symlink is in /usr/share/java and everybody else seems to find it without problem. I have v5.1.28 of the Oracle mysql-connector-java jar file, which is the current version, so it's not a problem with back-level files either.

I have MySQL installed on the 'master' node of the cluster; Ambari is happily using it, I can drive it from the CLI, can read and write data to it with Flume, etc. So I know that it is installed correctly and working, that I can connect to it from all the nodes in my cluster, and that it also has an (empty) database for Hive to use, has users created and permission grants in place, etc.

So, I cannot figure out what the underlying problem is, and I could use suggestions as to possible causes/solutions. Does Hive (well, the Hive schema tool) need to have JDBC somewhere special?

Regards, and thanks in advance to all,
Rick
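In case it is relevant, a quick way to check whether the connector is visible to Hive itself (as opposed to just being in /usr/share/java, which schematool does not appear to search) - the path below is the HDP default on my nodes, so adjust it for your layout:

```shell
# Quick check: is the MySQL connector on Hive's own classpath?
# (HDP-default path below - adjust to your install.)
if ls /usr/hdp/current/hive-client/lib/mysql-connector-java*.jar >/dev/null 2>&1; then
  echo "connector present in Hive lib"
else
  echo "connector missing from Hive lib - copy or symlink it there"
fi
```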
Labels: Apache Hive
03-10-2017
11:11 AM
Thanks Jay, but problem solved. See below.
03-10-2017
11:10 AM
Hey Jonathan - thanks for the reply. You put your finger on something that had not occurred to me, and your answer was correct: the port was both open and closed at the same time. It turns out to have been a DNS problem - somebody had updated a local DNS server with a dud record. The machines that then used that DNS to resolve hostnames got the (wrong) address of a machine that did not have port 3306 open, so they failed - including the Ambari server machine. Machines that happened to have been up for a while had the correct info cached locally, so they could still connect. Problem solved, and a mystery no more. I must confess that it had me tearing my hair out for a while....
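For anyone who hits the same symptom: the mismatch shows up if you compare what the resolver returns right now with whether the port actually answers. The hostname below is from my setup - substitute your own DB host:

```shell
# Compare the resolver's current answer with actual reachability of the
# MySQL port. A stale DNS record shows up as a resolvable name whose
# address then refuses the connection.
DB_HOST=hdata2.edi.local   # substitute your DB host
getent hosts "$DB_HOST" || echo "resolver has no answer for $DB_HOST"
if timeout 3 bash -c "exec 3<>/dev/tcp/$DB_HOST/3306" 2>/dev/null; then
  echo "port 3306 reachable"
else
  echo "port 3306 refused or unreachable"
fi
```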
Rick
03-09-2017
04:28 PM
Hi All - I am trying to install HDP through Ambari on a brand-new cluster (running Ubuntu) and am getting abends when I try to start the ambari-server. I get the following messages:

No errors were found.
ERROR: Exiting with exit code 1.
REASON: Database check failed to complete. Please check /var/log/ambari-server/ambari-server.log and /var/log/ambari-server/ambari-server-check-database.log for more information.

When I check ambari-server.log I find:

09 Mar 2017 16:09:52,356 ERROR [main] DBAccessorImpl:109 - Error while creating database accessor
com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:
** BEGIN NESTED EXCEPTION **
java.net.ConnectException
MESSAGE: Connection refused
STACKTRACE:
java.net.ConnectException: Connection refused
(large number of subsequent traceback lines omitted)

From ambari-server-check-database.log I get:

Ambari database consistency check started...
java.net.ConnectException
MESSAGE: Connection refused
STACKTRACE:
java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at java.net.Socket.connect(Socket.java:538)
at java.net.Socket.<init>(Socket.java:434)
at java.net.Socket.<init>(Socket.java:244)
at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:256)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:271)
at com.mysql.jdbc.Connection.createNewIO(Connection.java:2771)

Anybody got any ideas why this might be happening? The DB that I am targeting is MySQL - it's an install that works OK, that I can get into via the CLI and use normally, the ports seem to be open (I can telnet into it), JDBC is installed, and so forth. So, the DB seems to be functioning fine. I am both baffled and in need of help from the community... any suggestions / help welcomed!
Rick
Labels: Apache Ambari