Member since: 10-14-2016
Posts: 4
Kudos Received: 1
Solutions: 0
10-23-2017
08:11 PM
I intend to use org.apache.hadoop.hive.ql.hooks.PreExecute to run some custom code before a query executes. My use case is that I have a custom HiveStorageHandler that requires custom classes as well as a custom set of configurations for each query. I intend to use my PreExecute hook to set up these configurations based on the query and write them to a place where the query can access them (perhaps somewhere on HDFS). The hook will also be responsible for setting up the classpath for the query. Currently I'm trying to do that by modifying the Hive reloadable jars path at runtime, but that doesn't seem to work (see the sample code below). Is there an approach that would work for this?

@Override
public void run(SessionState sess, Set<ReadEntity> inputs, Set<WriteEntity> outputs,
                UserGroupInformation ugi) throws Exception {
  String reloadableJars = sess.getConf().get(HiveConf.ConfVars.HIVERELOADABLEJARS.toString());
  sess.getConf().set(HiveConf.ConfVars.HIVERELOADABLEJARS.toString(),
      reloadableJars + ",file:///opt/custom/lib/custom-jar-1.0.0.jar");
  sess.reloadAuxJars();
}
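Here is a rough sketch of the alternative I have in mind: register the hook class via hive.exec.pre.hooks, and inside it add the jar as a session resource (assuming SessionState.add_resource is available in this Hive version) and write a per-query configuration file to HDFS. The class name, the /tmp/custom-query-conf directory, and the custom.query.conf.path property are placeholders of my own, not anything Hive defines.

import java.util.Set;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.hooks.PreExecute;
import org.apache.hadoop.hive.ql.hooks.ReadEntity;
import org.apache.hadoop.hive.ql.hooks.WriteEntity;
import org.apache.hadoop.hive.ql.session.SessionState;
import org.apache.hadoop.hive.ql.session.SessionState.ResourceType;
import org.apache.hadoop.security.UserGroupInformation;

// Placeholder hook class; would be registered via hive.exec.pre.hooks.
public class CustomStorageHandlerPreHook implements PreExecute {

  @Override
  public void run(SessionState sess, Set<ReadEntity> inputs, Set<WriteEntity> outputs,
                  UserGroupInformation ugi) throws Exception {
    // Add the custom jar to the current session instead of rewriting
    // hive.reloadable.aux.jars.path at runtime.
    sess.add_resource(ResourceType.JAR, "file:///opt/custom/lib/custom-jar-1.0.0.jar");

    // Write a per-query configuration file to HDFS, keyed by session, so the
    // storage handler can read it back. The directory is a placeholder.
    Path confFile = new Path("/tmp/custom-query-conf/" + sess.getSessionId() + ".properties");
    FileSystem fs = confFile.getFileSystem(sess.getConf());
    FSDataOutputStream out = fs.create(confFile, true);
    try {
      out.writeBytes("custom.query.option=value\n");
    } finally {
      out.close();
    }

    // Record the file location in the session conf so the storage handler can find it.
    sess.getConf().set("custom.query.conf.path", confFile.toString());
  }
}

The idea is that everything the hook writes is keyed by session, so each query can pick up its own settings. I haven't verified whether a jar added this way is visible to the tasks launched for the query or only to the client side, so pointers on that would be appreciated.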
Labels: Apache Hive
10-18-2016
06:49 PM
I haven't confirmed it completely, but my issue is very likely an installation problem.
Thanks for responding.
10-17-2016
05:59 PM
Thanks for validating that, @bhagan.
Any idea why 'drop database' would be failing for me? I have concurrency enabled, but I'm not sure that matters.
I haven't found this error message elsewhere on the internet, but it appears that Hive is attempting to use a transactional table even though I have transactions turned off. Who is even responsible for creating these tables? Are they created even when transactions are disabled?
Thanks,
Ali Anwar
10-15-2016
01:42 AM
1 Kudo
I'm trying out Hive 1.2.1 on HDP 2.5.0, without Hive transactions.
I'm able to create a Hive database, but not able to drop it. I get the error pasted below.
I suspect it is due to HIVE-10632 being cherry-picked from Hive 1.3 to Hive 1.2 (source).
Has anyone successfully dropped a database on HDP 2.5.0?
I've not run into this issue with previous versions of HDP (such as HDP 2.1, 2.2, 2.3, and 2.4). A quick check of the metastore tables is sketched after the stack trace below.
MetaException(message:Unable to clean up com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive.TXN_COMPONENTS' doesn't exist
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1709)
at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1628)
at org.apache.commons.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
at org.apache.commons.dbcp.DelegatingStatement.executeUpdate(DelegatingStatement.java:228)
at org.apache.hadoop.hive.metastore.txn.TxnHandler.cleanupRecords(TxnHandler.java:1747)
at org.apache.hadoop.hive.metastore.AcidEventListener.onDropDatabase(AcidEventListener.java:51)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database_core(HiveMetaStore.java:1183)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_database(HiveMetaStore.java:1215)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:139)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:97)
at com.sun.proxy.$Proxy14.drop_database(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:8989)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$drop_database.getResult(ThriftHiveMetastore.java:8973)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
)
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:199)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
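For what it's worth, here is a quick standalone check I can run against the MySQL metastore to see whether the ACID bookkeeping tables were ever created. The JDBC URL, user, and password are placeholders for my setup, and the MySQL JDBC driver needs to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

// Standalone check: does the metastore database contain the TXN_COMPONENTS table
// that the AcidEventListener is trying to clean up? Connection details are placeholders.
public class CheckTxnTables {
  public static void main(String[] args) throws Exception {
    Connection conn = DriverManager.getConnection(
        "jdbc:mysql://metastore-host:3306/hive", "hiveuser", "hivepassword");
    try {
      ResultSet rs = conn.getMetaData().getTables(null, null, "TXN_COMPONENTS", null);
      System.out.println(rs.next()
          ? "TXN_COMPONENTS exists"
          : "TXN_COMPONENTS is missing - transactional schema was never installed");
    } finally {
      conn.close();
    }
  }
}

If the table is missing, that would line up with the MySQLSyntaxErrorException in the trace above.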