Created 10-15-2016 01:42 AM
I'm trying out Hive 1.2.1 on HDP 2.5.0, without Hive transactions. I'm able to create a Hive database, but I'm not able to drop it; I get the error pasted below. I suspect it is due to HIVE-10632 being cherry-picked from Hive 1.3 into Hive 1.2 (source). Has anyone successfully dropped a database on HDP 2.5.0? I never ran into this issue with previous versions of HDP (such as HDP 2.1, 2.2, 2.3, and 2.4).
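For context, the sequence that triggers it is nothing more than a plain create followed by a drop (the database name here is just illustrative):

    CREATE DATABASE test_db;   -- succeeds
    DROP DATABASE test_db;     -- fails with the MetaException below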
MetaException(message:Unable to clean up com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive.TXN_COMPONENTS' doesn't exist
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.Util.getInstance(Util.java:386)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
    at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1709)
    at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1628)
    at org.apache.commons.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
    at org.apache.commons.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
    at org.apache.hadoop.hive.metastore.txn.TxnHandler.cleanupRecords(TxnHandler.java:1747)
    at org.apache.hadoop.hive.metastore.AcidEventListener.onDropDatabase(AcidEventListener.java:51)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database_core(HiveMetaStore.java:1183)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database(HiveMetaStore.java:1215)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:139)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:97)
    at com.sun.proxy.$Proxy14.drop_database(Unknown Source)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:8989)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:8973)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
)
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:199)
    at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
    at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Created 10-17-2016 05:01 PM
I'm using the HDP Sandbox 2.5 that includes Hive 1.2.1.2.5. I made sure that transactions are off. Using the Hive View, I just created and dropped a test database.
Created 10-17-2016 05:59 PM
Thanks for validating that, @bhagan. Any idea why 'drop database' would be failing for me? I have concurrency enabled, but I'm not sure whether that matters. I haven't found this error message anywhere else online, but it appears the metastore is trying to touch a transaction table even though I have transactions turned off. Who is responsible for creating these tables in the first place? Are they created even when transactions are turned off? Thanks, Ali Anwar
Created 10-18-2016 02:14 PM
Ali, in my Sandbox, I have concurrency set to true.
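For reference, "concurrency on, transactions off" in hive-site.xml looks roughly like this (a sketch of the standard Hive properties, not copied from my Sandbox; DummyTxnManager is Hive's non-transactional default):

    <property>
      <name>hive.support.concurrency</name>
      <value>true</value>
    </property>
    <property>
      <!-- With transactions off, the txn manager stays on the no-op default
           rather than org.apache.hadoop.hive.ql.lockmgr.DbTxnManager -->
      <name>hive.txn.manager</name>
      <value>org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager</value>
    </property>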
My understanding is that the transaction tables are created during Hive installation, even if transactions are not turned on.
Out of curiosity, how did you install Hive? Could you also look at the metastore database's table listing, along the lines of the check below, to see whether TXN_COMPONENTS and the other transaction tables exist? If some are missing, there was likely an issue during installation.
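A quick way to check, assuming a MySQL metastore backing database named hive (adjust the database name to your setup):

    -- Run against the metastore's backing MySQL database (assumed to be named `hive`)
    USE hive;
    -- The transaction tables created by the metastore schema scripts include
    -- TXNS, TXN_COMPONENTS, COMPLETED_TXN_COMPONENTS, HIVE_LOCKS, and COMPACTION_QUEUE.
    SHOW TABLES LIKE 'TXN%';             -- should list TXNS and TXN_COMPONENTS
    SHOW TABLES LIKE 'COMPLETED_TXN%';   -- should list COMPLETED_TXN_COMPONENTS
    SHOW TABLES LIKE 'HIVE_LOCKS';
    SHOW TABLES LIKE 'COMPACTION%';

If the first query comes back empty, TXN_COMPONENTS (the table named in your error) was never created.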
Created 10-18-2016 06:49 PM
I haven't confirmed it completely, but my problem is very likely an installation issue. Thanks for responding.
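In case it helps anyone else who lands here: if the transaction tables turn out to be missing, one option is to apply Hive's transaction schema script to the metastore database by hand. The exact path and script version below are assumptions and vary by HDP release, so verify them against your cluster before running anything:

    # Locate the MySQL transaction-schema script shipped with Hive
    # (path is an assumption; it differs between HDP versions)
    ls /usr/hdp/current/hive-metastore/scripts/metastore/upgrade/mysql/hive-txn-schema-*.mysql.sql

    # Apply it to the metastore backing database (assumed to be named `hive`),
    # after taking a backup of that database first
    mysql -u hive -p hive < /usr/hdp/current/hive-metastore/scripts/metastore/upgrade/mysql/hive-txn-schema-0.14.0.mysql.sql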