Support Questions


HIVE connection refused critical alert

Super Collaborator

I have only one alert (critical) in the system, coming from HIVE, and the only thing not working is that I can't connect to the hive2 database using beeline. I can connect as the hive user using MySQL and can see all the tables under the hive db.

The error I am seeing in Ambari is as follows:

(err_msg) Fail: Execution of '! beeline -u 'jdbc:hive2://hadoop2.tolls.dot.state.fl.us:10000/;transportMode=binary' -e '' 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL'' returned 1.
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop2.tolls.dot.state.fl.us:10000/;transportMode=binary: java.net.ConnectException: Connection refused (state=08S01,code=0)
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop2.tolls.dot.state.fl.us:10000/;transportMode=binary: java.net.ConnectException: Connection refused (state=08S01,code=0) )
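The alert script above does nothing more than run beeline and grep its output for two fatal patterns; when either matches, the pipeline succeeds and Ambari raises the CRITICAL alert. A minimal sketch of that matching logic, replayed against a canned copy of the error line (the hostname and URI are just the ones from the alert text):

```shell
# Canned copy of the beeline failure line from the Ambari alert above.
msg='Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop2.tolls.dot.state.fl.us:10000/;transportMode=binary: java.net.ConnectException: Connection refused (state=08S01,code=0)'

# Same pattern test the alert uses: a match on either fatal string
# makes grep exit 0, which the alert reports as CRITICAL.
if printf '%s\n' "$msg" | grep -q -i -e 'Connection refused' -e 'Invalid URL'; then
  echo "CRITICAL: HiveServer2 not reachable"
fi
```

So the alert is purely a symptom of beeline's "Connection refused"; the real question is why nothing is answering on port 10000.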


1 ACCEPTED SOLUTION

Super Collaborator

Gouri, you were right: it was a privileges issue on the Linux /tmp/hive folder; I had been changing the permissions of the HDFS /tmp/hive folder instead. I can access beeline now and can connect to the Hive store. I have other issues, though, for which I will open a new post.

Thanks for your help.
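For anyone hitting the same thing: Hive's local scratch directory defaults to /tmp/hive on the HiveServer2 host (hive.exec.local.scratchdir), which is a different directory from /tmp/hive in HDFS. A minimal sketch of the local fix described above, using a sandbox path (/tmp/hive-demo) so it can be replayed safely; on a real node the target would be /tmp/hive on the Linux filesystem:

```shell
# Sandbox stand-in for the local scratch dir (real path: /tmp/hive on
# the HiveServer2 host, NOT the HDFS path of the same name).
d=/tmp/hive-demo
mkdir -p "$d"
# World-writable with the sticky bit, matching /tmp semantics, so the
# hive service user can create its scratch files there.
chmod 1777 "$d"
ls -ld "$d"
```

The easy trap, as in this thread, is running `hdfs dfs -chmod` on /tmp/hive while the denied path is the identically named local directory.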


16 REPLIES

Super Collaborator

I added the permissions for the hive user on /tmp/hive; still no luck:

[hive@hadoop2 ~]$ id
uid=502(hive) gid=502(hadoop) groups=502(hadoop)
[hive@hadoop2 ~]$
[hive@hadoop2 ~]$ uname -a > a.a
[hive@hadoop2 ~]$ hdfs dfs -copyFromLocal a.a /tmp/hive/b.b
[hive@hadoop2 ~]$

Super Collaborator

I also tried this command. It looks like something is wrong in the database?

[root@hadoop2 java]# metatool -listFSRoot
WARNING: Use "yarn jar" to launch YARN applications.
Initializing HiveMetaTool..
16/09/13 15:24:05 INFO metastore.ObjectStore: ObjectStore, initialize called
16/09/13 15:24:05 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/09/13 15:24:05 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/09/13 15:24:05 ERROR Datastore.Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:mysql://hadoop2.tolls.dot.state.fl.us/hive?createDatabaseIfNotExist=true, username = hive. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'hive'@'hadoop2.tolls.dot.state.fl.us' (using password: YES)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
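The "Access denied for user 'hive'@'hadoop2.tolls.dot.state.fl.us'" line means MySQL rejected the metastore credentials for that specific host, so either the password in the Hive config doesn't match or the hive account lacks a grant for that host. A hypothetical repair, run as the MySQL root user on the metastore host; 'hive_password' is a placeholder that must match the password Ambari stores in javax.jdo.option.ConnectionPassword:

```sql
-- Grant the hive account access to the metastore DB from the
-- HiveServer2 host named in the error above.
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'hadoop2.tolls.dot.state.fl.us'
  IDENTIFIED BY 'hive_password';
FLUSH PRIVILEGES;
```

(In this particular thread the root cause turned out to be local /tmp/hive permissions, but this is the standard check when metatool reports access denied.)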

Super Collaborator

Can you run this, so that we know exactly what's going on:

/usr/bin/hive --service hiveserver2 --hiveconf hive.root.logger=DEBUG

Super Collaborator

Hi Gouri, hiveserver2 is already running on the hive node hadoop2. Do you want me to kill that process and run the one you gave in nohup mode so it runs in the background?

[root@hadoop2 hive]# ps -ef | grep hiveserver2
hive      8474     1  2 16:33 ?        00:00:09 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/bin/java -Xmx1024m -Dhdp.version=2.4.3.0-227 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.4.3.0-227 -Dhadoop.log.dir=/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.4.3.0-227/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.4.3.0-227/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -XX:MaxPermSize=512m -Xmx4467m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.4.3.0-227/hive/lib/hive-service-1.2.1000.2.4.3.0-227.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris=  -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/var/log/hive
root     10843  2790  0 16:40 pts/0    00:00:00 grep hiveserver2
[root@hadoop2 hive]#

Super Collaborator

Please run the below command as the hive user: /usr/bin/hive --service hiveserver2 --hiveconf hive.root.logger=DEBUG. Yes, kill the hiveserver2 process on hadoop2 and run this on the same node.

Super Collaborator

(Attachment: hive.zip) I bounced all the servers and restarted all components; attaching the new log files. Can you please check whether you still see hive user permission issues on /tmp/hive? I am still getting permission denied.
