Created 05-13-2016 09:25 PM
Is anyone able to shed any light on this? I'm starting up a cluster using Cloudbreak/Ambari with a blueprint and an external metastore DB, but when I try to run very simple operations via beeline (e.g. "show schemas;") I get an error every second try:
0: jdbc:hive2://x.x.x.x:10000> show schemas;
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mysql/jdbc/SQLError (state=08S01,code=1)
See the hiveserver2 logs below. If I restart hiveserver2 the error just goes away, but I can reproduce it reliably. Any ideas??
2016-05-13 20:36:51,782 INFO [HiveServer2-Background-Pool: Thread-268]: metastore.ObjectStore (ObjectStore.java:initialize(294)) - ObjectStore, initialize called
2016-05-13 20:36:51,786 ERROR [HiveServer2-Background-Pool: Thread-268]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invoke(159)) - java.lang.NoClassDefFoundError: com/mysql/jdbc/SQLError
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3575)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
    at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1606)
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1503)
    at com.mysql.jdbc.ConnectionImpl.getTransactionIsolation(ConnectionImpl.java:3173)
    at com.jolbox.bonecp.ConnectionHandle.getTransactionIsolation(ConnectionHandle.java:825)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:444)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:378)
    at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:328)
    at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:94)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:430)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:396)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:621)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.store.query.Query.execute(Query.java:1654)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:192)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:138)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:300)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:263)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:603)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:581)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_all_databases(HiveMetaStore.java:1188)
    at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at com.sun.proxy.$Proxy11.get_all_databases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1037)
    at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
    at com.sun.proxy.$Proxy12.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1237)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2262)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:390)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1728)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1262)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1121)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
    at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
    at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-05-13 20:36:51,787 ERROR [HiveServer2-Background-Pool: Thread-268]: exec.DDLTask (DDLTask.java:failed(525)) - java.lang.NoClassDefFoundError: com/mysql/jdbc/SQLError
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3575)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
    at com.mysql.jdbc.StatementImpl.executeSimpleNonQuery(StatementImpl.java:1606)
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1503)
    at com.mysql.jdbc.ConnectionImpl.getTransactionIsolation(ConnectionImpl.java:3173)
    at com.jolbox.bonecp.ConnectionHandle.getTransactionIsolation(ConnectionHandle.java:825)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:444)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getXAResource(ConnectionFactoryImpl.java:378)
    at org.datanucleus.store.connection.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:328)
    at org.datanucleus.store.connection.AbstractConnectionFactory.getConnection(AbstractConnectionFactory.java:94)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:430)
    at org.datanucleus.store.AbstractStoreManager.getConnection(AbstractStoreManager.java:396)
    at org.datanucleus.store.rdbms.query.JDOQLQuery.performExecute(JDOQLQuery.java:621)
    at org.datanucleus.store.query.Query.executeQuery(Query.java:1786)
    at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
    at org.datanucleus.store.query.Query.execute(Query.java:1654)
    at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:192)
    at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:138)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:300)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:263)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:603)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:581)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_all_databases(HiveMetaStore.java:1188)
    at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    at com.sun.proxy.$Proxy11.get_all_databases(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getAllDatabases(HiveMetaStoreClient.java:1037)
    at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
    at com.sun.proxy.$Proxy12.getAllDatabases(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1237)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showDatabases(DDLTask.java:2262)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:390)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1728)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1262)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1121)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
    at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
    at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
    at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-05-13 20:36:51,787 INFO [HiveServer2-Background-Pool: Thread-268]: hooks.ATSHook (ATSHook.java:<init>(90)) - Created ATS Hook
2016-05-13 20:36:51,787 INFO [HiveServer2-Background-Pool: Thread-268]: log.PerfLogger (PerfLogger.java:PerfLogBegin(135)) - <PERFLOG method=FailureHook.org.apache.hadoop.hive.ql.hooks.ATSHook from=org.apache.hadoop.hive.ql.Driver>
2016-05-13 20:36:51,787 INFO [HiveServer2-Background-Pool: Thread-268]: log.PerfLogger (PerfLogger.java:PerfLogEnd(162)) - </PERFLOG method=FailureHook.org.apache.hadoop.hive.ql.hooks.ATSHook start=1463171811787 end=1463171811787 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2016-05-13 20:36:51,787 ERROR [HiveServer2-Background-Pool: Thread-268]: ql.Driver (SessionState.java:printError(962)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mysql/jdbc/SQLError
2016-05-13 20:36:51,788 INFO [HiveServer2-Background-Pool: Thread-268]: ql.Driver (Driver.java:execute(1629)) - Resetting the caller context to 2
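In case it's relevant, this is roughly how I've been checking which MySQL Connector/J jar the Hive services actually pick up (the /usr/hdp/current and /usr/share/java paths are assumptions based on a standard HDP layout, so adjust for your install):

# list the MySQL JDBC driver jars visible to HiveServer2 and the shared location Ambari uses
ls -l /usr/hdp/current/hive-server2/lib/mysql-connector-java*.jar
ls -l /usr/share/java/mysql-connector-java*.jar
# print the bundled driver version from each jar's manifest
for jar in /usr/hdp/current/hive-server2/lib/mysql-connector-java*.jar /usr/share/java/mysql-connector-java*.jar; do
  echo "$jar"; unzip -p "$jar" META-INF/MANIFEST.MF | grep -i version
done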
Created 06-10-2016 04:20 PM
Hi guys. This is caused by an old MySQL JDBC driver. As stated above, you can either replace the driver yourself or wait until the next release, where the driver will be updated. Sorry for the inconvenience.
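If you'd rather not wait for the next release, a rough sketch of the manual workaround on an Ambari-managed cluster looks like this (the jar name/version and the /usr/share/java path are assumptions for a typical HDP setup, so adapt them to your environment):

# copy a current Connector/J jar (downloaded from dev.mysql.com) into the shared driver location
cp mysql-connector-java-5.1.x-bin.jar /usr/share/java/mysql-connector-java.jar
# re-register the JDBC driver with Ambari so it distributes the new jar to the Hive hosts
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar

Then restart Hive Metastore and HiveServer2 from Ambari so they pick up the new driver.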