Created on 08-07-2018 01:41 PM - edited 09-16-2022 06:34 AM
I have the wikiticker datasource already set up on Druid, and I want to access it with Hive.
My environment is an HDP 2.6.5 sandbox on VMware.
I'm connecting to HiveServer2 using Beeline, and these are the commands I use:
!connect jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
add jar /usr/hdp/2.6.5.0-292/hive2/lib/hive-druid-handler.jar;
CREATE EXTERNAL TABLE druid_table_1 STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "wikiticker");
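For what it's worth, the datasource can also be checked on the Druid side; I'm assuming here that the broker listens on its default port 8082, which may not match the sandbox setup:
# List the datasources known to the Druid broker (port 8082 is an assumption)
curl http://sandbox-hdp.hortonworks.com:8082/druid/v2/datasources
# "wikiticker" should appear in the returned JSON array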
The error I get is:
Error: org.apache.thrift.transport.TTransportException (state=08S01,code=0)
After that, every command I type returns the same error, even "show tables".
In hiveserver2.log I found these lines:
2018-08-07 13:11:36,622 ERROR [HiveServer2-Background-Pool: Thread-22099]: bonecp.ConnectionHandle (ConnectionHandle.java:markPossiblyBroken(388)) - Database access problem. Killing off this connection and all remaining connections in the connection pool. SQL State = 08S01
2018-08-07 13:11:36,622 ERROR [HiveServer2-Background-Pool: Thread-22099]: txn.TxnHandler (TxnHandler.java:getDbConn(1978)) - There is a problem with a connection from the pool, retrying(rc=9): Communications link failure
I get the same error using the Hive View 2.0 in Ambari.
Thanks
Created 08-07-2018 09:43 PM
Hi @Michele Proverbio!
I guess you're having issues with the DB used for the Hive Metastore. Could you check if the service is up, and whether you can connect and access the Hive Metastore tables? To confirm this, let's first check if you're able to connect through Beeline without ZK.
beeline -u 'jdbc:hive2://sandbox-hdp.hortonworks.com:10000/default'
Otherwise, I'd say to check your DB setup.
PS: if you're not aware of your DB parameters for Hive, you can follow these steps:
Go to Ambari > Hive > Configurations > Hive Metastore
or you can try to run the following commands:
#Example using hiveCLI
hive -e "set; " > /tmp/hive.properties
cat /tmp/hive.properties | grep -i "javax.jdo.option."
#Example using beeline
beeline -u 'jdbc:hive2://localhost:10000/default' -e "set;" > /tmp/beeline.properties
cat /tmp/beeline.properties | grep -i "javax.jdo.option."
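Once you have the javax.jdo.option values, another quick sanity check is to hit the metastore DB directly with the mysql client. This is just a sketch; substitute the host, user and password from your own ConnectionURL, and keep in mind that metastore table names can vary a bit between schema versions:
# Connect to the metastore DB with the same host/user as javax.jdo.option.ConnectionURL
mysql -h sandbox-hdp.hortonworks.com -u root -p
-- then inside the mysql shell:
USE hive;
SHOW TABLES;
SELECT COUNT(*) FROM TBLS;   -- TBLS holds one row per Hive table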
Hope this helps!
Created 08-08-2018 06:29 AM
Hi @Vinicius Higa Murakami, thanks for your reply.
My mysqld daemon is up and running.
I am able to connect to Hive without ZK:
0: jdbc:hive2://sandbox-hdp.hortonworks.com:1> show tables;
+-----------+--+
| tab_name  |
+-----------+--+
+-----------+--+
No rows selected (0.375 seconds)
and those are the metastore properties:
hive.conf.hidden.list=javax.jdo.option.ConnectionPassword,hive.server2.keystore.password
javax.jdo.option.ConnectionDriverName=com.mysql.jdbc.Driver
javax.jdo.option.ConnectionURL=jdbc:mysql://sandbox-hdp.hortonworks.com/hive?createDatabaseIfNotExist=true
javax.jdo.option.ConnectionUserName=root
javax.jdo.option.DetachAllOnCommit=true
javax.jdo.option.Multithreaded=true
javax.jdo.option.NonTransactionalRead=true
Any idea?
Thanks
Created 08-08-2018 06:35 PM
Hmmm @Michele Proverbio, it seems you're using root as the ConnectionUserName.
javax.jdo.option.ConnectionUserName=root
Could you try to connect with the following command and rerun the DDL (the create external table)?
beeline -u 'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2' -n root
I'm expecting to see errors, but a different error this time (permission for root to write on HDFS).
Usually, when we set up the DB for the Hive Metastore, we use the hive user as the owner of the connection.
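If you want to switch to a dedicated hive user on the MySQL side, it's roughly the following (a sketch only; pick your own password, adjust the host part to your setup, and then update javax.jdo.option.ConnectionUserName/ConnectionPassword in Ambari to match):
-- run as the MySQL root user
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive-password-here';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;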
https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.2.2/bk_ambari-administration/content/using_hive...
I'm not sure if this is related to your issue, but I think it's worth a try 🙂
Hope this helps!
Created 08-09-2018 08:05 AM
I tried your connection string and tried to create the external table, but the same error occurs:
Error: org.apache.thrift.transport.TTransportException (state=08S01,code=0)
As stated in the original post, it breaks every successive query.
Created 08-09-2018 02:47 PM
Hello @Michele Proverbio!
Just asking, but what happens if you create a simple table without org.apache.hadoop.hive.druid.DruidStorageHandler?
Does the same issue happen?
If so, what we can do is check for errors/warnings in the Hive Metastore logs.
Otherwise, I'd say enable DEBUG logging for HS2 and try to figure out what's going on with your Hive Metastore.
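For the first check, something along these lines should do (just a sketch; the table name is only an example):
-- plain table, no Druid handler involved
CREATE TABLE ttransport_smoke_test (id INT, msg STRING);
SHOW TABLES;
DROP TABLE ttransport_smoke_test;
For the DEBUG part, on the sandbox that's usually done from Ambari > Hive > Configs > Advanced hive-log4j, raising hive.root.logger from INFO to DEBUG and restarting HS2 (the exact property name can differ depending on the log4j template in your version).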
Hope this helps!
Created 08-10-2018 08:16 AM
Thanks for your responses once again.
I'm able to create tables without the DruidStorageHandler.
As you recommended, I enabled the DEBUG logging level, and this is the information I could retrieve from the logs:
hive-server2.log:
WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification
I don't know if this can be of any help, but that's the only warning there is in the file.
hivemetastore.log:
2018-08-10 07:11:37,134 ERROR [Thread-18]: bonecp.ConnectionHandle (ConnectionHandle.java:markPossiblyBroken(388)) - Database access problem. Killing off this connection and all remaining connections in the connection pool. SQL State = 08S01
2018-08-10 07:11:37,135 ERROR [Thread-18]: txn.TxnHandler (TxnHandler.java:getDbConn(1978)) - There is a problem with a connection from the pool, retrying(rc=9): Communications link failure
The last packet successfully received from the server was 15,572 milliseconds ago. The last packet sent successfully to the server was 12 milliseconds ago. (SQLState=08S01, ErrorCode=0)
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet successfully received from the server was 15,572 milliseconds ago. The last packet sent successfully to the server was 12 milliseconds ago.
    at sun.reflect.GeneratedConstructorAccessor67.newInstance(Unknown Source)
    ...
    at org.apache.hadoop.hive.ql.txn.compactor.Cleaner.run(Cleaner.java:81)
Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost.
    at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:3011)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3469)
    ... 10 more
2018-08-10 07:11:37,164 ERROR [BoneCP-pool-watch-thread]: bonecp.BoneCP (BoneCP.java:obtainInternalConnection(292)) - Failed to acquire connection to jdbc:mysql://sandbox-hdp.hortonworks.com/hive?createDatabaseIfNotExist=true. Sleeping for 7000 ms. Attempts left: 5
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
This is the error where the connection fails, but I didn't have any luck googling it. Maybe you guys have seen this in the past.
Thanks
Created 08-10-2018 04:29 PM
Hi @Michele Proverbio!
Got it. Could you confirm whether you're using Hive LLAP (Hive Interactive) to access the DruidStorageHandler?
If you aren't, then try to follow these steps:
Ambari > Hive > Interactive Query On > choose a node to install HiveServer2 Interactive > Save > Restart
Then, go again to Ambari > Hive > HiveServer2 Interactive JDBC and copy the JDBC link.
After that, run the same CREATE TABLE using the DruidStorageHandler.
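On the sandbox the Interactive JDBC URL usually looks something like the one below, with a separate ZooKeeper namespace for HiveServer2 Interactive; the namespace can differ between installs, so copy the exact string from Ambari rather than typing this one:
beeline -u 'jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2-hive2'
CREATE EXTERNAL TABLE druid_table_1
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES ("druid.datasource" = "wikiticker");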
Hope this helps!
Created 08-16-2018 02:42 PM
Did you have a chance to look at my last answer?
Created 08-22-2018 08:35 AM
That worked. After installing the Hive Interactive component, I was able to create the external table.
Thank you so much!
Sorry for the late response, but I went on vacation 😛