
Cannot create & insert into Druid using a HiveQL command

Explorer

Hi,

I ran a simple SQL command to create and insert data into Druid, but it always fails with the following error:

Even the map-reduce stage itself runs well.

Note that all Hive & Druid configuration has been set correctly.

Could someone help me with this case? I have been searching and working on this error for two full days but still fail.

Thanks

==============================================================================================

2018-11-08T08:48:28,101 DEBUG [main-SendThread(192.168.1.112:2181)]: zookeeper.ClientCnxn (:()) - Got ping response for sessionid: 0x166e8e0c7930f25 after 0ms 2018-11-08T08:48:28,204 DEBUG [org.apache.ranger.audit.queue.AuditBatchQueue0-SendThread(hdpdev:2181)]: zookeeper.ClientCnxn (:()) - Got ping response for sessionid: 0x166e8e0c7930f32 after 0ms 2018-11-08T08:48:29,300 WARN [HiveServer2-Background-Pool: Thread-3016]: common.RetryUtils (:()) - Failed on try 7, retrying in 58,513ms. org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL '' at org.skife.jdbi.v2.DBI.open(DBI.java:230) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI.withHandle(DBI.java:279) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector$2.call(SQLMetadataConnector.java:135) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.java.util.common.RetryUtils.retry(RetryUtils.java:63) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.java.util.common.RetryUtils.retry(RetryUtils.java:81) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector.retryWithHandle(SQLMetadataConnector.java:139) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector.retryWithHandle(SQLMetadataConnector.java:148) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector.createTable(SQLMetadataConnector.java:189) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector.createSegmentTable(SQLMetadataConnector.java:261) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.druid.io.druid.metadata.SQLMetadataConnector.createSegmentTable(SQLMetadataConnector.java:545) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandler.preCreateTable(DruidStorageHandler.java:235) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:897) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:887) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2934) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1001) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1017) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4964) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:395) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2701) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2372) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2048) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1740) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:318) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112] at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] 
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:331) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] Caused by: java.sql.SQLException: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL '' at org.apache.commons.dbcp2.BasicDataSource.createConnectionFactory(BasicDataSource.java:2023) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:1897) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1413) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DataSourceConnectionFactory.openConnection(DataSourceConnectionFactory.java:36) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI.open(DBI.java:212) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] ... 50 more Caused by: java.sql.SQLException: No suitable driver at org.apache.commons.dbcp2.BasicDataSource.createConnectionFactory(BasicDataSource.java:2014) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:1897) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:1413) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DataSourceConnectionFactory.openConnection(DataSourceConnectionFactory.java:36) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI.open(DBI.java:212) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] ... 50 more

9 Replies

Explorer

The HiveQL command is: "0: jdbc:hive2://192.168.1.112:2181/default> CREATE TABLE nq_druid_hiv...rp_tonghop_hoso1 (Stage-1) DEBUG : DagInfo: {"context":"Hive","description":"CREATE TABLE nq_druid_hive STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES (\"druid.segment.granularity\" = \"DAY\",\"druid.query.granularity\" = \"DAY\") AS SELECT cast(ngay as timestamp) as `__time`,cast(trangthai as string) c_donvi,cast(donvi as string) c_donvi,cast(linhvuc as string) c_linhvuc, soluong FROM rp_tonghop_hoso1"}"

ERROR:

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.skife.jdbi.v2.exceptions.UnableToObtainConnectionException: java.sql.SQLException: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL '' (state=08S01,code=1)
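
The empty connect URL '' in this exception is the key detail: the Druid storage handler is building a JDBC connection to Druid's metadata store, and the URI it was handed is blank, which means hive.druid.metadata.uri never reached HiveServer2 (the next reply confirms this). A minimal sketch of the properties involved, assuming a MySQL-backed Druid metadata store as in the trace (host, database, and credentials are placeholders):

set hive.druid.metadata.db.type=mysql;
set hive.druid.metadata.uri=jdbc:mysql://<metadata-host>:3306/druid;
set hive.druid.metadata.username=druid;
set hive.druid.metadata.password=<password>;

These can also be set cluster-wide in hive-site.xml (via Ambari on HDP) instead of per session.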

Explorer

Hi all,

After setting some session variables as follows (a consolidated sketch follows the list):

1. set hive.druid.metadata.uri=jdbc:mysql://hdpdev:3306/druid;

2. set hive.llap.execution.mode=none;

3. CREATE EXTERNAL TABLE nq_druid_hive STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource"="dummy","druid.segment.granularity" = "MONTH","druid.query.granularity" = "DAY") AS SELECT cast(ngay as timestamp) as `__time`,cast(trangthai as string) c_donvi,cast(donvi as string) c_donvi,cast(linhvuc as string) c_linhvuc, soluong FROM rp_tonghop_hoso1;
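
For anyone following along, the three steps above combine into one Beeline session roughly like this (a sketch only; note that the original CTAS aliases both trangthai and donvi as c_donvi, and Hive silently renames the second to c_donvi_1, as the CREATE_TABLE event below shows, so c_trangthai was probably intended):

set hive.druid.metadata.uri=jdbc:mysql://hdpdev:3306/druid;
set hive.llap.execution.mode=none;

CREATE EXTERNAL TABLE nq_druid_hive
STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'
TBLPROPERTIES (
  "druid.datasource" = "dummy",
  "druid.segment.granularity" = "MONTH",
  "druid.query.granularity" = "DAY"
)
AS SELECT
  cast(ngay as timestamp) AS `__time`,
  cast(trangthai as string) c_trangthai,  -- was c_donvi in the original, a duplicate alias
  cast(donvi as string) c_donvi,
  cast(linhvuc as string) c_linhvuc,
  soluong
FROM rp_tonghop_hoso1;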

Then another error came up:

===========================================

2018-11-08T16:57:06,211 DEBUG [NotificationEventPoll 0]: events.NotificationEventPoll (:()) - Event: NotificationEvent(eventId:8592, eventTime:1541671025, eventType:CREATE_TABLE, dbName:ngoquyen, tableName:nq_druid_hive, message:{"server":"thrift://192.168.1.112:9083","servicePrincipal":"hive/_HOST@EXAMPLE.COM","db":"ngoquyen","table":"nq_druid_hive","tableType":"EXTERNAL_TABLE","tableObjJson":"{\"1\":{\"str\":\"nq_druid_hive\"},\"2\":{\"str\":\"ngoquyen\"},\"3\":{\"str\":\"hive\"},\"4\":{\"i32\":1541671025},\"5\":{\"i32\":0},\"6\":{\"i32\":0},\"7\":{\"rec\":{\"1\":{\"lst\":[\"rec\",5,{\"1\":{\"str\":\"__time\"},\"2\":{\"str\":\"timestamp\"}},{\"1\":{\"str\":\"c_donvi\"},\"2\":{\"str\":\"string\"}},{\"1\":{\"str\":\"c_donvi_1\"},\"2\":{\"str\":\"string\"}},{\"1\":{\"str\":\"c_linhvuc\"},\"2\":{\"str\":\"string\"}},{\"1\":{\"str\":\"soluong\"},\"2\":{\"str\":\"int\"}}]},\"2\":{\"str\":\"hdfs://hdpdev:8020/warehouse/tablespace/external/hive/ngoquyen.db/nq_druid_hive\"},\"5\":{\"tf\":0},\"6\":{\"i32\":-1},\"7\":{\"rec\":{\"2\":{\"str\":\"org.apache.hadoop.hive.druid.serde.DruidSerDe\"},\"3\":{\"map\":[\"str\",\"str\",1,{\"serialization.format\":\"1\"}]}}},\"8\":{\"lst\":[\"str\",0]},\"9\":{\"lst\":[\"rec\",0]},\"10\":{\"map\":[\"str\",\"str\",0,{}]},\"11\":{\"rec\":{\"1\":{\"lst\":[\"str\",0]},\"2\":{\"lst\":[\"lst\",0]},\"3\":{\"map\":[\"lst\",\"str\",0,{}]}}},\"12\":{\"tf\":0}}},\"8\":{\"lst\":[\"rec\",0]},\"9\":{\"map\":[\"str\",\"str\",9,{\"druid.segment.granularity\":\"MONTH\",\"totalSize\":\"0\",\"EXTERNAL\":\"TRUE\",\"numFiles\":\"0\",\"transient_lastDdlTime\":\"1541671025\",\"bucketing_version\":\"2\",\"druid.datasource\":\"dummy\",\"druid.query.granularity\":\"DAY\",\"storage_handler\":\"org.apache.hadoop.hive.druid.DruidStorageHandler\"}]},\"12\":{\"str\":\"EXTERNAL_TABLE\"},\"13\":{\"rec\":{\"1\":{\"map\":[\"str\",\"lst\",0,{}]}}},\"14\":{\"tf\":0},\"17\":{\"str\":\"hive\"},\"18\":{\"i32\":1}}","timestamp":1541671025,"files":[]}, messageFormat:json-0.2, catName:hive) 2018-11-08T16:57:06,216 DEBUG [NotificationEventPoll 0]: metastore.HiveMetaStoreClient (:()) - Got back 0 events 2018-11-08T16:57:06,216 DEBUG [NotificationEventPoll 0]: events.NotificationEventPoll (:()) - Processed 7 notification events 2018-11-08T16:57:06,223 DEBUG [HiveServer2-Background-Pool: Thread-212]: hdfs.DFSClient (:()) - /druid/segments/dummy/20180901T000000.000Z_20181001T000000.000Z/2018-11-08T16_56_41.600+07_00: masked={ masked: rwxr-xr-x, unmasked: rwxrwxrwx } 2018-11-08T16:57:06,223 DEBUG [IPC Parameter Sending Thread #3]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive sending #694 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs 2018-11-08T16:57:06,250 DEBUG [IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive got value #694 2018-11-08T16:57:06,316 DEBUG [HiveServer2-Background-Pool: Thread-212]: retry.RetryInvocationHandler (:()) - Exception while invoking call #694 ClientNamenodeProtocolTranslatorPB.mkdirs over null. Not retrying because try once and fail. 
org.apache.hadoop.ipc.RemoteException: Permission denied: user=hive, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1850) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1834) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1793) at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150) at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707) at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1443) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1353) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at com.sun.proxy.$Proxy29.mkdirs(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:653) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source) ~[?:?] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] 
at com.sun.proxy.$Proxy30.mkdirs(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2409) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2385) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1325) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1322) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1339) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1314) ~[hadoop-hdfs-client-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2326) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hadoop.hive.druid.DruidStorageHandlerUtils.publishSegmentWithShardSpec(DruidStorageHandlerUtils.java:777) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandlerUtils.lambda$publishSegmentsAndCommit$6(DruidStorageHandlerUtils.java:541) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.tweak.transactions.LocalTransactionHandler.inTransaction(LocalTransactionHandler.java:184) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.BasicHandle.inTransaction(BasicHandle.java:327) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI$5.withHandle(DBI.java:333) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI.withHandle(DBI.java:281) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.skife.jdbi.v2.DBI.inTransaction(DBI.java:329) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandlerUtils.publishSegmentsAndCommit(DruidStorageHandlerUtils.java:469) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandler.loadAndCommitDruidSegments(DruidStorageHandler.java:602) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandler.commitInsertTable(DruidStorageHandler.java:824) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandler.commitCreateTable(DruidStorageHandler.java:261) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:904) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:887) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212) 
~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2934) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1001) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1017) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4964) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:395) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2701) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2372) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2048) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1740) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:318) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112] at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] 
at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:331) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] 2018-11-08T16:57:06,352 DEBUG [IPC Parameter Sending Thread #3]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive sending #695 org.apache.hadoop.hdfs.protocol.ClientProtocol.delete 2018-11-08T16:57:06,353 DEBUG [IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive got value #695 2018-11-08T16:57:06,353 DEBUG [HiveServer2-Background-Pool: Thread-212]: ipc.ProtobufRpcEngine (:()) - Call: delete took 2ms 2018-11-08T16:57:06,353 DEBUG [IPC Parameter Sending Thread #3]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive sending #696 org.apache.hadoop.hdfs.protocol.ClientProtocol.delete 2018-11-08T16:57:06,354 DEBUG [IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive]: ipc.Client (:()) - IPC Client (731610911) connection to hdpdev/192.168.1.112:8020 from hive got value #696 2018-11-08T16:57:06,354 DEBUG [HiveServer2-Background-Pool: Thread-212]: ipc.ProtobufRpcEngine (:()) - Call: delete took 1ms 2018-11-08T16:57:06,355 ERROR [HiveServer2-Background-Pool: Thread-212]: exec.DDLTask (:()) - Failed org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Transaction failed do to exception being thrown from within the callback. See cause for the original exception.) 
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1012) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1017) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4964) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:395) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2701) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2372) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2048) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1740) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:318) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112] at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.0.1.0-187.jar:?] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:331) ~[hive-service-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112] Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Transaction failed do to exception being thrown from within the callback. See cause for the original exception. 
at org.apache.hadoop.hive.druid.DruidStorageHandler.commitInsertTable(DruidStorageHandler.java:829) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.druid.DruidStorageHandler.commitCreateTable(DruidStorageHandler.java:261) ~[hive-druid-handler-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:904) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:887) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:212) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2934) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] at com.sun.proxy.$Proxy54.createTable(Unknown Source) ~[?:?] at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:1001) ~[hive-exec-3.1.0.3.0.1.0-187.jar:3.1.0.3.0.1.0-187] ... 25 more

===========================================
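
The root cause is buried in the middle of that trace: Permission denied: user=hive, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x. The storage handler is trying to create /druid/segments/... on HDFS, the directory does not exist yet, and the hive user is not allowed to create it under /. One common fix is to pre-create the segment directory with the right owner (a sketch, assuming the default HDP segment path shown in the log; adjust the owner/group to your site, and run as the HDFS superuser):

hdfs dfs -mkdir -p /druid/segments
hdfs dfs -chown -R hive:hadoop /druid

(Depending on which service writes segments in your setup, druid:hadoop may be the appropriate owner instead.)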

Explorer

Hi all,

Another issue came up after setting some configs in the Hive session. 😞

Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:1, Vertex vertex_1541674887397_0011_2_01 [Reducer 2] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)
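
A vertex/DAG failure message like this rarely contains the root cause; the real exception is in the Tez task logs. A sketch of how to pull them (the application ID here is inferred from the vertex ID prefix vertex_1541674887397_0011_..., following Tez's vertex_<clusterTimestamp>_<appId>_... naming, so verify it against the ResourceManager UI):

yarn logs -applicationId application_1541674887397_0011 > dag.log
grep -n -A 20 "Caused by" dag.log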

Thanks

HPM

Explorer

Hi,

I tried the solutions from the related links below, but the error still happens for a simple CREATE command:

https://community.hortonworks.com/questions/90648/hive-error-vertex-failed.html

http://www.hadoopadmin.co.in/hive/tez-job-fails-with-vertex-failure-error/

https://blog.csdn.net/helloxiaozhe/article/details/79710707

One suggestion was to clean up HDFS, but that also failed.

Can any pros help me with this issue?

Thanks

HPM

Explorer

Hi

Debugging every step in the Hive log, I found that one error happens before the Vertex issue:

2018-11-09T10:34:24,876 DEBUG [HiveServer2-Background-Pool: Thread-3717]: ipc.ProtobufRpcEngine (:()) - Call: getVertexStatus took 2ms 2018-11-09T10:34:24,876 INFO [HiveServer2-Background-Pool: Thread-3717]: monitoring.RenderStrategy$LogToFileFunction (:()) - Map 1: 1/1Reducer 2: 0(+0,-4)/2 2018-11-09T10:34:24,876 ERROR [HiveServer2-Background-Pool: Thread-3717]: SessionState (:()) - Status: Failed 2018-11-09T10:34:24,876 ERROR [HiveServer2-Background-Pool: Thread-3717]: SessionState (:()) - Vertex failed, vertexName=Reducer 2, vertexId=vertex

BR

HPM

Explorer

2018-11-09T11:36:00,309 DEBUG [HiveServer2-Background-Pool: Thread-196]: ipc.ProtobufRpcEngine (:()) - Call: getVertexStatus took 1ms 2018-11-09T11:36:00,309 DEBUG [HiveServer2-Background-Pool: Thread-196]: rpc.DAGClientRPCImpl (:()) - GetVertexStatus via AM for app: application_1541737770736_0001 dag: dag_1541737770736_0001_2 vertex: Reducer 2 2018-11-09T11:36:00,309 DEBUG [IPC Parameter Sending Thread #0]: ipc.Client (:()) - IPC Client (425275537) connection to hdpdev:37675 from hive sending #1684 org.apache.tez.dag.api.client.rpc.DAGClientAMProtocolBlockingPB.getVertexStatus 2018-11-09T11:36:00,309 DEBUG [IPC Client (425275537) connection to hdpdev:37675 from hive]: ipc.Client (:()) - IPC Client (425275537) connection to hdpdev:37675 from hive got value #1684 2018-11-09T11:36:00,309 DEBUG [HiveServer2-Background-Pool: Thread-196]: ipc.ProtobufRpcEngine (:()) - Call: getVertexStatus took 0ms 2018-11-09T11:36:00,310 INFO [HiveServer2-Background-Pool: Thread-196]: monitoring.RenderStrategy$LogToFileFunction (:()) - Map 1: 1/1Reducer 2: 0(+0,-10)/2 2018-11-09T11:36:00,310 ERROR [HiveServer2-Background-Pool: Thread-196]: SessionState (:()) - Status: Failed 2018-11-09T11:36:00,310 ERROR [HiveServer2-Background-Pool: Thread-196]: SessionState (:()) - Vertex failed, vertexName=Reducer 2, vertexId=vertex_1541737770736_0001_2_01, diagnostics=[Task failed, taskId=task_1541737770736_0001_2_01_000000, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1541737770736_0001_2_01_000000_0:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing vector batch (tag=0) (vectorizedVertexNum 1)
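
The last line of that log is the actual clue: "Hive Runtime Error while processing vector batch" points at the vectorized reducer rather than at Druid itself. As a quick isolation test (a sketch; these are standard Hive session properties, not a permanent fix), vectorization can be switched off to see whether the CTAS then completes:

set hive.vectorized.execution.enabled=false;
set hive.vectorized.execution.reduce.enabled=false;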

Explorer

Hi all,

Just to share with you all: this issue was finally fixed, after three long days of trial and error, by setting up the memory configuration.

Thanks

HPM

New Contributor

@hung pham

Hi, can you please share the exact steps you performed to solve this issue?

Explorer

Hi,

I don't remember the details now because it was 3 months ago. I could have told you more if you had asked sooner.

But for sure it was the memory configuration (a typical starting point is sketched below).
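
Since the exact values are lost, here is a typical HDP 3.x starting point for the Tez memory knobs involved in this kind of Reducer failure (a sketch only; the original poster's values are unknown, so size these to your YARN container limits):

set hive.tez.container.size=4096;                     -- in MB; must fit within yarn.scheduler.maximum-allocation-mb
set hive.tez.java.opts=-Xmx3276m;                     -- roughly 80% of the container size
set tez.runtime.io.sort.mb=1024;                      -- shuffle sort buffer
set tez.runtime.unordered.output.buffer.size-mb=409;  -- unordered output buffer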

BR

HPM
