
Hive not opening / working ::: Exception in thread "main" java.lang.RuntimeException: org.apache.had

Contributor

Hello Everyone

I am receiving an error when triggering the hive command. I have included the error log output at the bottom. For reference, I followed the steps mentioned below. Any suggestion will be highly appreciated. It has been two weeks now and I have not been able to solve the issue. Please help.

[hduser@storage Softwares]$ vi ~/.bashrc

# Add Hive bin directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin/
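
After editing ~/.bashrc, the change can be applied to the current shell and verified (a quick sanity check using the paths above):

source ~/.bashrc
echo $HIVE_HOME     # should print /home/hduser/Softwares/apache-hive-2.1.1-bin
which hive          # should resolve to a path under $HIVE_HOME/bin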

Created the following directory inside HDFS:
[hduser@storage ~]$  hadoop fs -ls /user/hive/
17/03/16 18:26:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxrwxr-x   - hduser supergroup          0 2017-03-13 18:23 /user/hive/warehouse
[hduser@storage ~]$
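
For reference, the warehouse directory had been created along these lines (a sketch; the exact permissions may differ in your setup):

hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -chmod g+w /user/hive/warehouse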

Created the metastore database for Hive:
[hduser@storage bin]$ mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> create database hive;
Query OK, 1 row affected (0.05 sec)

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A

Database changed
mysql> CREATE USER 'hive'@'storage.castrading.com' IDENTIFIED BY 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> grant all privileges on hive to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.08 sec)

mysql> grant all privileges on hive.* to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.09 sec)

mysql> grant all on *.* to 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)


mysql>

mysql> use hive;
Database changed
mysql>

mysql> show tables;
Empty set (0.00 sec)

mysql> select user,host from mysql.user;
+------------+------------------------+
| user       | host                   |
+------------+------------------------+
|            | %                      |
| %          | %                      |
| retail_dba | %                      |
| root       | %                      |
| root       | 127.0.0.1              |
| retail_dba | 192.168.0.227          |
| root       | 192.168.0.227          |
| root       | localhost              |
| root       | sotrage.castrading.com |
| hive       | storage.castrading.com |
| root       | storage.castrading.com |
+------------+------------------------+
11 rows in set (0.00 sec)

mysql>


mysql> show tables;
+---------------------------+
| Tables_in_hive            |
+---------------------------+
| BUCKETING_COLS            |
| CDS                       |
| COLUMNS_V2                |
| DATABASE_PARAMS           |
| DBS                       |
| DB_PRIVS                  |
| DELEGATION_TOKENS         |
| FUNCS                     |
| FUNC_RU                   |
| GLOBAL_PRIVS              |
| IDXS                      |
| INDEX_PARAMS              |
| KEY_CONSTRAINTS           |
| MASTER_KEYS               |
| NOTIFICATION_LOG          |
| NOTIFICATION_SEQUENCE     |
| NUCLEUS_TABLES            |
| PARTITIONS                |
| PARTITION_EVENTS          |
| PARTITION_KEYS            |
| PARTITION_KEY_VALS        |
| PARTITION_PARAMS          |
| PART_COL_PRIVS            |
| PART_COL_STATS            |
| PART_PRIVS                |
| ROLES                     |
| ROLE_MAP                  |
| SDS                       |
| SD_PARAMS                 |
| SEQUENCE_TABLE            |
| SERDES                    |
| SERDE_PARAMS              |
| SKEWED_COL_NAMES          |
| SKEWED_COL_VALUE_LOC_MAP  |
| SKEWED_STRING_LIST        |
| SKEWED_STRING_LIST_VALUES |
| SKEWED_VALUES             |
| SORT_COLS                 |
| TABLE_PARAMS              |
| TAB_COL_STATS             |
| TBLS                      |
| TBL_COL_PRIVS             |
| TBL_PRIVS                 |
| TYPES                     |
| TYPE_FIELDS               |
| VERSION                   |
+---------------------------+
46 rows in set (0.00 sec)

mysql>
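
The grants can also be double-checked from the shell afterwards (a quick verification, not part of the original steps):

mysql -u root -p -e "SHOW GRANTS FOR 'hive'@'storage.castrading.com';"
mysql -u root -p -e "SHOW GRANTS FOR 'hive'@'%';"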

Also copied the mysql-connector-java*.jar file into the apache-hive lib folder.
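
A quick way to confirm the driver jar is actually in place (assuming HIVE_HOME points at the apache-hive-2.1.1-bin directory):

ls $HIVE_HOME/lib/ | grep -i mysql-connector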

HIVE LOG OUTPUT

[hduser@storage ~]$ hive
which: no hbase in (/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/)

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    ... 9 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    ... 14 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    ... 23 more
Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
    at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
------

NestedThrowables:
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
    at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
------

    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:529)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:834)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    ... 28 more
Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
    at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544)
    at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
------

    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
    at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
    ... 58 more
Caused by: java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    ... 70 more
[hduser@storage ~]$

1 ACCEPTED SOLUTION

Contributor

Hello Everyone

I was able to solve the Hive issue. Below is what I did. Please do not hesitate to ask if the steps below do not work:

 

Step 1: Make sure you have followed the following steps accordingly. In my case I had created hive-site.xml manually with [hduser@storage conf]$ vi hive-site.xml, so I deleted that hive-site.xml and followed the steps below instead.

[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cp hive-default.xml.template hive-site.xml

Step 2: Create the directory /tmp/hive.
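
For example (a sketch; depending on the setup, a matching directory in HDFS with open permissions may also be needed):

mkdir -p /tmp/hive
chmod 777 /tmp/hive
hadoop fs -mkdir -p /tmp/hive
hadoop fs -chmod 777 /tmp/hive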

 

Step 3: Now edit hive-site.xml (the one created in Step 1 above) and make sure the following properties are added or set accordingly:

<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
 </property>

 

<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>Username to use against metastore database</description>
  </property>

<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>

<property>
   <name>hive.querylog.location</name>
   <value>/tmp/hive</value>
   <description>Location of Hive run time structured log file</description>
</property>

<property>
   <name>hive.exec.local.scratchdir</name>
   <value>/tmp/hive</value>
   <description>Local scratch space for Hive jobs</description>
</property>

<property>
   <name>hive.downloaded.resources.dir</name>
   <value>/tmp/hive</value>
   <description>Temporary local directory for added resources in the remote file system.</description>
</property>
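
If the metastore schema does not exist in MySQL yet, it may also need to be initialized with schematool once hive-site.xml is in place (this step is not part of the list above; whether it is needed depends on your setup):

schematool -dbType mysql -initSchema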

 

 

Step 4: Make sure that you have added the following line inside hive-env.sh:

 

export HADOOP_HOME=/home/hduser/hadoop-2.6.5

 

Note: Adjust the HADOOP_HOME location and version to match your installation.
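
If hive-env.sh does not exist yet, it can be created from the template that ships with Hive and the export above added to it (a sketch, assuming the standard conf directory):

cd $HIVE_HOME/conf
cp hive-env.sh.template hive-env.sh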

 

Step 5: Check the daemon status with jps and then try starting hive.
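
For example (the exact process list depends on your setup, but the HDFS and YARN daemons should be running):

jps     # NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
hive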

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>

 

 


9 REPLIES

Champion

@UjjwalRana

 

It seems your issue is related to access: "Access denied for user 'APP'@'storage'".

 

Please refer to the link below for the answer:

http://stackoverflow.com/questions/23040084/hive-connectivity-to-mysql-access-denied-for-user-hivelo...
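
It may also help to test the metastore connection manually with the same user and host that hive-site.xml points to (adjust the host and credentials to match your config):

mysql -u hive -h storage.castrading.com -p -e "SELECT 1;" hive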

 

 

Contributor

Hi Saranvisa,

I followed the URL you provided, but I don't see that issue in my case. I am able to log in to both localhost and storage.castrading.com; the mysql output is shown below.

 

Also, please find the contents of hive-site.xml below:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>

Note : I even tried giving jdbc:mysql://localhost/hive?createDatabaseIfNotExist=true
    <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
 </property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
 <name>datanucleus.autoCreateTables</name>
 <value>True</value>
 </property>
</configuration>

 

[hduser@storage ~]$ mysql -u hive -p -hlocalhost
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 208
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> \q
Bye


[hduser@storage ~]$ mysql -u hive -p -hstorage.castrading.com
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 209
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql>

 

 

[hduser@storage ~]$ mysql -u hive -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 214
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| hive               |
| metastore          |
| mysql              |
| retail_db          |
| retail_db_backup   |
+--------------------+
6 rows in set (0.10 sec)

mysql>

Champion

 

<property>
    <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://localhost/storage.castrading.com</value>
</property>

 

Could you please try the above. Also, the error clearly tells us that the user APP does not have privileges on storage.castrading.com. I see you have created the user hive and granted it privileges on storage.castrading.com; using the hive user, you should be able to suppress this error.

 

Copying it from your post, I don't see a user APP in the list below:

 

mysql> select user,host from mysql.user;
+------------+------------------------+
| user       | host                   |
+------------+------------------------+
|            | %                      |
| %          | %                      |
| retail_dba | %                      |
| root       | %                      |
| root       | 127.0.0.1              |
| retail_dba | 192.168.0.227          |
| root       | 192.168.0.227          |
| root       | localhost              |
| root       | sotrage.castrading.com |
| hive       | storage.castrading.com |
| root       | storage.castrading.com |

Contributor

My understanding is that it should take the username and password hive, i.e. the ones defined inside hive-site.xml. If you look at hive-site.xml, the username and password are both set to hive, but I have not been able to understand where it is taking the APP user from. Also, hive-site.xml is located inside the conf directory. By the way, I tried the following:

<property>
    <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://localhost/storage.castrading.com</value>
</property>

 

And I also tried the following, but there was no progress:

<property>
    <name>javax.jdo.option.ConnectionURL</name>
     <value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>
</property>

Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://localhost/storage.castrading.com, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'APP'@'localhost' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
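
One way to narrow down where the APP username is coming from (a quick sanity check, not from the original replies) is to confirm that the connection properties in hive-site.xml use exactly the property names Hive expects; otherwise Hive falls back to its built-in defaults:

grep -n "javax.jdo" /home/hduser/Softwares/apache-hive-2.1.1-bin/conf/hive-site.xml
# the expected names are javax.jdo.option.ConnectionUserName and javax.jdo.option.ConnectionPassword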

 

hive-site.xml location

[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.1.1-bin/conf
[hduser@storage conf]$ ls -ltr
total 276
-rw-r--r--. 1 hduser hadoop   2662 Nov 29 05:32 parquet-logging.properties
-rw-r--r--. 1 hduser hadoop   2060 Nov 29 05:32 ivysettings.xml
-rw-r--r--. 1 hduser hadoop   2925 Nov 29 05:32 hive-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop   2925 Nov 29 05:32 hive-log4j2.properties
-rw-r--r--. 1 hduser hadoop   1596 Nov 29 05:32 beeline-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop   2719 Nov 29 05:32 llap-cli-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop   2274 Nov 29 05:32 hive-exec-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop   4353 Nov 29 05:35 llap-daemon-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop   2378 Nov 29 05:35 hive-env.sh.template
-rw-r--r--. 1 hduser hadoop 229198 Nov 30 03:46 hive-default.xml.template
-rw-r--r--. 1 hduser hadoop   2378 Mar 15 14:20 hive-env.sh~
-rw-r--r--. 1 hduser hadoop   1918 Mar 15 19:07 hive-site.xmlbackup
-rw-r--r--. 1 hduser hadoop   1913 Mar 18 08:26 hive-site.xml


[hduser@storage conf]$

hive-site.xml output

 

<configuration>
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>

Note : I even tried giving jdbc:mysql://localhost/hive?createDatabaseIfNotExist=true
    <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
 </property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
 <name>datanucleus.autoCreateTables</name>
 <value>True</value>
 </property>
</configuration>

 

[hduser@storage conf]$  schematool -dbType mysql -info
which: no hbase in (/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/)
Metastore connection URL:     jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :     com.mysql.jdbc.Driver
Metastore connection User:     APP
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : Access denied for user 'APP'@'storage' (using password: YES)
SQL Error code: 1045
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
[hduser@storage conf]$
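A side note on the output above: the connection user is reported as APP, which is Hive's built-in default for javax.jdo.option.ConnectionUserName, so the username and password set in hive-site.xml are evidently not being picked up (the property names in the file above are spelled javax.jdo.optioin.* instead of javax.jdo.option.*). As a quick sanity check that the MySQL credentials themselves work, the config file can be bypassed by passing them to schematool directly, assuming this version of schematool supports the -userName/-passWord overrides:

[hduser@storage conf]$ schematool -dbType mysql -info -userName hive -passWord hive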

avatar
Contributor

Hello Everyone

Any help with the above issue would be highly appreciated. I have been stuck on it for many days now. If anyone could provide any feedback, it would be a great help and I would be very thankful.

 

Thanks

 

Ujjwal Rana

avatar
Contributor

Hi Again

As per the URL you gave, I ran the MySQL statements accordingly, but I am still getting the same issue. Below is what I did. I have been stuck for almost three weeks now, so any feedback would be much appreciated.

mysql> grant all on *.* to 'hive'@'192.168.0.227' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
 
[hduser@storage lib]$ mysql -u hive -h 192.168.0.227 -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 13
Server version: 5.1.73 Source distribution

Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql>
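To double-check that the grant really covers the account and host Hive will connect from, the following can be run from the same mysql session (the user and host here mirror the values used above; adjust as needed):

mysql> SHOW GRANTS FOR 'hive'@'192.168.0.227';
mysql> SELECT user, host FROM mysql.user WHERE user = 'hive';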


[hduser@storage Desktop]$ hive
which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/)

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
   
Caused by: java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
  
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    ... 70 more
[hduser@storage Desktop]$

Also, please find the contents of hive-site.xml again:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
 </property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
 <name>datanucleus.autoCreateTables</name>
 <value>True</value>
 </property>
</configuration>
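One thing worth double-checking in the file above: the username and password property names are spelled javax.jdo.optioin.* rather than javax.jdo.option.*, which would explain why the connection still falls back to the default APP user. Corrected, those two properties would read:

<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>Username to use against metastore database</description>
</property>
<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
</property>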

avatar
Contributor

Hello Everyone

I was finally able to solve the Hive issue. Below is what I did. Please do not hesitate to ask if the steps below do not work for you:

 

Step 1: Make sure you have followed the following steps in order. In my case I had created hive-site.xml manually with [hduser@storage conf]$ vi hive-site.xml, so I deleted that hive-site.xml and recreated it from the template as shown below.

[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cp hive-default.xml.template hive-site.xml

Step 2: Create a /tmp/hive directory (example commands just below).
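A minimal sketch of that step, assuming the same hduser account and that Hive also needs a writable /tmp/hive scratch directory on HDFS:

[hduser@storage ~]$ mkdir -p /tmp/hive
[hduser@storage ~]$ hadoop fs -mkdir -p /tmp/hive
[hduser@storage ~]$ hadoop fs -chmod g+w /tmp/hive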

 

Step 3: Now edit the hive-site.xml created in Step 1 and make sure the following properties are present and set as shown:

<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
 </property>

<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>Username to use against metastore database</description>
  </property>

<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    <description>password to use against metastore database</description>
  </property>

<property>
   <name>hive.querylog.location</name>
   <value>/tmp/hive</value>
   <description>Location of Hive run time structured log file</description>
</property>

<property>
   <name>hive.exec.local.scratchdir</name>
   <value>/tmp/hive</value>
   <description>Local scratch space for Hive jobs</description>
</property>

<property>
   <name>hive.downloaded.resources.dir</name>
   <value>/tmp/hive</value>
   <description>Temporary local directory for added resources in the remote file system.</description>
</property>
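Since a hand-edited hive-site.xml can easily end up with malformed XML, it may also help to validate the file before starting Hive; assuming xmllint is available, the command below prints nothing if the file is well-formed:

[hduser@storage conf]$ xmllint --noout hive-site.xml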

 

 

Step 4: Make sure you have added the following line inside hive-env.sh:

 

export HADOOP_HOME=/home/hduser/hadoop-2.6.5

 

Note: Adjust the HADOOP_HOME path to match your own Hadoop installation location and version.
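If hive-env.sh does not exist yet, it can be created from the template that ships in the Hive conf directory, for example:

[hduser@storage conf]$ cp hive-env.sh.template hive-env.sh
[hduser@storage conf]$ echo 'export HADOOP_HOME=/home/hduser/hadoop-2.6.5' >> hive-env.sh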

 

Step 5: Check with jps that the Hadoop daemons are running, then try starting hive; the startup output should look like the log below.
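On a single-node setup like this one, jps would be expected to show the HDFS and YARN daemons before Hive is started; the process IDs below are only illustrative:

[hduser@storage ~]$ jps
2401 NameNode
2513 DataNode
2705 SecondaryNameNode
2874 ResourceManager
2990 NodeManager
3120 Jps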

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>

 

 

avatar
New Contributor

Hi @UjjwalRana 

I'm getting the same error after following all your steps.
I was able to get into the hive shell earlier too, but when I try to create or list tables, I get the error below:

show tables;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

avatar
Community Manager

@Venkat_, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. This will also give you an opportunity to provide details specific to your environment, which could help others give you a more accurate answer. You can link this thread as a reference in your new post.



Regards,

Vidya Sargur,
Community Manager


Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.