Member since: 01-21-2019
Posts: 13
Kudos Received: 1
Solutions: 0
02-20-2019
04:58 PM
On the YARN Queues tab, I see: root (100%), default (0%), and llap (100%). Does this mean my changes had no effect?
02-20-2019
04:52 PM
That seems to work; no error in the log. The issue was that I had specified it in floating point? Also, my job is now accepted on the Applications page, but it never starts. There is one other job running on that queue (default), but when I try to kill it, it just errors out with: Error: Kill application failed!
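For anyone else who hits the "Kill application failed!" error in the UI: killing the application from the command line sometimes works where the UI does not. A sketch, with <application_id> standing in for the ID shown on the Applications page:

```shell
# List running applications to find the ID, then kill it from the CLI
# (run as a user with admin rights on the queue, e.g. the yarn user)
yarn application -list -appStates RUNNING
yarn application -kill <application_id>
```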
02-20-2019
02:34 AM
I am trying to adjust the YARN capacity scheduler settings for a newly set up Hive/Druid stack. The system was working, but this week it started failing with:

Failed to submit application_1550602511762_0005 to YARN : org.apache.hadoop.security.AccessControlException: Queue root.default already has 0 applications, cannot accept submission of application: application_1550602511762_0005

It was suggested that I needed to reduce the capacity of the llap queue. This is the scheduler config I am trying:

yarn.scheduler.capacity.default.minimum-user-limit-percent=100
yarn.scheduler.capacity.maximum-am-resource-percent=0.2
yarn.scheduler.capacity.maximum-applications=10000
yarn.scheduler.capacity.node-locality-delay=40
yarn.scheduler.capacity.resource-calculator=org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator
yarn.scheduler.capacity.root.accessible-node-labels=*
yarn.scheduler.capacity.root.acl_administer_queue=*
yarn.scheduler.capacity.root.acl_submit_applications=*
yarn.scheduler.capacity.root.capacity=100
yarn.scheduler.capacity.root.default.acl_administer_jobs=*
yarn.scheduler.capacity.root.default.acl_submit_applications=*
yarn.scheduler.capacity.root.default.capacity=0.0
yarn.scheduler.capacity.root.default.maximum-capacity=0.0
yarn.scheduler.capacity.root.default.state=RUNNING
yarn.scheduler.capacity.root.default.user-limit-factor=1
yarn.scheduler.capacity.root.llap.acl_administer_queue=hive
yarn.scheduler.capacity.root.llap.acl_submit_applications=hive
yarn.scheduler.capacity.root.llap.capacity=25.0
yarn.scheduler.capacity.root.llap.maximum-am-resource-percent=1
yarn.scheduler.capacity.root.llap.maximum-capacity=100.0
yarn.scheduler.capacity.root.llap.minimum-user-limit-percent=100
yarn.scheduler.capacity.root.llap.ordering-policy=fifo
yarn.scheduler.capacity.root.llap.state=RUNNING
yarn.scheduler.capacity.root.llap.user-limit-factor=1
yarn.scheduler.capacity.root.queues=default,llap
yarn.scheduler.capacity.schedule-asynchronously.enable=true
yarn.scheduler.capacity.schedule-asynchronously.maximum-threads=1
yarn.scheduler.capacity.schedule-asynchronously.scheduling-interval-ms=10

caused by: java.io.IOException: Failed to re-init queues : Illegal capacity of 0.25 for children of queue root
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler.reinitialize(CapacityScheduler.java:478)
at org.apache.hadoop.yarn.server.resourcemanager.AdminService.refreshQueues(AdminService.java:423)
at org.apache.hadoop.yarn.server.resourcemanager.AdminService.refreshQueues(AdminService.java:394)
.. 10 more
Caused by: java.lang.IllegalArgumentException: Illegal capacity of 0.25 for children of queue root

I have tried various configs that add up to 100, or are less than 100, but it keeps complaining.
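For context on the "Illegal capacity of 0.25" message: the Capacity Scheduler requires that the capacities of a queue's children sum to 100% (it reports the sum as a fraction, so default=0.0 plus llap=25.0 yields 0.25 instead of the required 1.0). A minimal sketch of that invariant, as my own illustration rather than the actual Hadoop code:

```python
# Illustration of the Capacity Scheduler invariant that sibling queue
# capacities must sum to 100% (not the actual Hadoop implementation).
def check_children(capacities, epsilon=1e-6):
    """capacities: dict of child queue name -> capacity in percent."""
    total = sum(capacities.values())
    if abs(total - 100.0) > epsilon:
        # Hadoop reports the fraction (total / 100), e.g. 0.25
        raise ValueError(
            f"Illegal capacity of {total / 100.0} for children of queue root")
    return total

# The posted config: default=0.0, llap=25.0 -> fails with 0.25.
# A valid split must sum to 100, e.g. default=75.0, llap=25.0.
```

With this constraint in mind, any split of root.default.capacity and root.llap.capacity that sums to exactly 100 (for example 75.0 and 25.0) should pass the re-init check.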
Labels:
- Apache Hive
- Apache YARN
02-19-2019
05:24 PM
Any suggestions on how to do that? (Complete noob here.) I'm looking at https://community.hortonworks.com/articles/149486/llap-sizing-and-setup.html, but I think that covers a previous version of HDP (I'm using 3.0), so I'm not seeing exactly what I need to adjust.
02-19-2019
12:52 AM
Hi all (new Druid/Hive setup on HDP, by a noob). I had this working last week, but today when I send a query via Hive I get the following error. Any pointers greatly appreciated.

Caused by: org.apache.hadoop.yarn.exceptions.YarnException: Failed to submit application_1550527619881_0001 to YARN : org.apache.hadoop.security.AccessControlException: Queue root.default already has 0 applications, cannot accept submission of application: application_1550527619881_0001
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:304) ~[hadoop-yarn-client-3.1.1.3.1.0.0-78.jar:?]
at org.apache.tez.client.TezYarnClient.submitApplication(TezYarnClient.java:77) ~[tez-api-0.9.1.3.1.0.0-78.jar:0.9.1.3.1.0.0-78]
at org.apache.tez.client.TezClient.start(TezClient.java:402) ~[tez-api-0.9.1.3.1.0.0-78.jar:0.9.1.3.1.0.0-78]
... 28 more
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask
INFO : Completed executing command(queryId=hive_20190218220946_044fa99b-a547-4e1c-9915-2b9a9c0cb17c); Time taken: 1.315 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)

Poking around in the YARN management UI, I see I have 2 applications that have been running for 6 days: one is llap0 (yarn-service) and one is hive-<UUID>. Trying to kill the hive app (from the UI) just gives: Error: Kill application failed! The llap0 service is shown to be at 100% but is still running. I haven't tried killing it yet; not sure I should.
Labels:
- Apache Hive
02-14-2019
05:35 PM
Indeed, thank you, that solved the issue. I was just surprised that I needed to do all that; I thought Ambari would have set that up from the beginning. A follow-up question: when I run the query on Druid directly, it returns in about 2 seconds; when I run the same query through Hive, it converts to a map-reduce job and takes 2+ minutes. Any thoughts on why? I'm guessing that when running through Hive the query is not being pushed down, and instead all the data is being streamed back to Hive?
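On the pushdown question: one way to check is to EXPLAIN the query from Beeline. In my understanding of the Hive/Druid integration, when the aggregation is pushed down, the TableScan in the plan carries a generated Druid query (a druid.query.json property); when it is not, the plan shows a plain scan followed by Hive's own Group By/aggregation operators. A sketch against the same table:

```sql
-- If pushdown happens, look for a druid.query.json property on the
-- TableScan; otherwise Hive is streaming rows back and aggregating itself.
EXPLAIN
SELECT max(`__time`), accepteddate, accountid
FROM sms_messages
GROUP BY accountid, accepteddate;
```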
02-14-2019
04:49 AM
Hi all, I have recently set up a new Druid/Hive stack (and yes, I am new to both, and to Hadoop/Tez, etc.). I have used the admin user across the entire stack for the whole setup, via HDP 3.0. I have been able to run some basic queries from Hive without an issue, such as:

select `__time`,accepteddate,accountid from sms_messages group by accountid, accepteddate, `__time` limit 100;

which works fine. But if I run (from SQuirreL):

select max(`__time`),accepteddate,accountid from sms_messages group by accountid, accepteddate;

(note: this same query works when run on Druid directly), it fails with:

Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask SQLState: 08S01 ErrorCode: 1

From Beeline (with more details):

WARN : The session: sessionId=d797cf5f-3b33-49bb-aed9-c57f0e6a30c0, queueName=null, user=admin, doAs=true, isOpen=false, isDefault=false has not been opened
INFO : Subscribed to counters: [] for queryId: hive_20190213215439_c3cfa7e9-ea7c-4dc2-930b-dc8d1cee41fc
INFO : Tez session hasn't been created yet.
Opening session
ERROR : Failed to execute tez graph.
org.apache.hadoop.security.AccessControlException: Permission denied: user=admin, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x

FULL STACK DETAILS BELOW

0: jdbc:hive2://wdc-tst-bdrd-001.openmarket.c> select max(`__time`),accepteddate,accountid from sms_messages group by accountid, accepteddate;
INFO : Compiling command(queryId=hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4): select max(`__time`),accepteddate,accountid from sms_messages group by accountid, accepteddate
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:timestamp with local time zone, comment:null), FieldSchema(name:accepteddate, type:string, comment:null), FieldSchema(name:accountid, type:string, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4); Time taken: 0.119 seconds
INFO : Executing command(queryId=hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4): select max(`__time`),accepteddate,accountid from sms_messages group by accountid, accepteddate
INFO : Query ID = hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4
INFO : Total jobs = 1
INFO : Launching Job 1 out of 1
INFO : Starting task [Stage-1:MAPRED] in serial mode
WARN : The session: sessionId=3b91cc44-16a0-46c0-8c27-6d2b15697ba7, queueName=null, user=admin, doAs=true, isOpen=false, isDefault=false has not been opened
INFO : Subscribed to counters: [] for queryId: hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4
INFO : Tez session hasn't been created yet.
Opening sessionERROR : Failed to execute tez graph.org.apache.hadoop.security.AccessControlException: Permission denied: user=admin, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1857) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1841) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1800) at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150) at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707) at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112] at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112] at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112] at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2417) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2391) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1325) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1322) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1339) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1314) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:2275) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] 
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:1001) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:1153) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:896) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.makeCombinedJarMap(TezSessionState.java:349) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:418) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.openInternal(TezSessionPoolSession.java:124) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:373) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:372) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:199) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2711) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2382) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2054) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1752) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at 
org.apache.hadoop.hive.ql.Driver.run(Driver.java:1746) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) ~[hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:226) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:324) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112] at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:342) ~[hive-service-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_112] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) ~[?:1.8.0_112] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) ~[?:1.8.0_112] at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=admin, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399) at 
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1857) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1841) at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1800) at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59) at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150) at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126) at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707) at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1443) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1353) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] 
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at com.sun.proxy.$Proxy32.mkdirs(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:653) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source) ~[?:?] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?] at com.sun.proxy.$Proxy33.mkdirs(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2415) ~[hadoop-hdfs-client-3.1.1.3.1.0.0-78.jar:?] ... 38 moreERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTaskINFO : Completed executing command(queryId=hive_20190213215031_4b3f0067-f772-4705-8f4e-4da82e1fa8f4); Time taken: 0.533 secondsError: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask (state=08S01,code=1)
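For reference, the root cause in the trace above is HDFS denying user admin WRITE access on /user (owned by hdfs:hdfs with mode drwxr-xr-x) while Tez tries to mkdir its staging/resource directory. The usual fix, assuming the admin user should simply have an HDFS home directory (the group shown here is an assumption; your cluster may use a different one), is to create it as the hdfs superuser:

```shell
# Run on a cluster node as the HDFS superuser.
# Creates /user/admin and hands ownership to 'admin' so Hive/Tez
# can create its session resources there.
sudo -u hdfs hdfs dfs -mkdir -p /user/admin
sudo -u hdfs hdfs dfs -chown admin:hdfs /user/admin
```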
Labels:
- Apache Hive
02-09-2019
01:16 AM
1 Kudo
Hi all, I've been trying to get Hive/Druid/Kafka working (noob to both Hive and Druid). Getting close, I think.
I have gotten to the point of being able to load data via Kafka into Druid.
I have been able to create 'external Druid' tables in Hive, and even query them, such that I can get accurate row counts.
But whenever I actually query any columns, I get NULL back for every column (except __time). This entire stack was set up via HDP 3, onto 3 boxes.

CREATE EXTERNAL TABLE teststuff5 STORED BY 'org.apache.hadoop.hive.druid.DruidStorageHandler' TBLPROPERTIES ("druid.datasource" = "teststuff5");

Queries come back like this (note that the time value is non-null, and for some reason one other column is non-null as well); every other column, accepteddate through userdefined2, is NULL:

| teststuff5.__time         | teststuff5.accepteddate | ... | teststuff5.sourceaddress | ... |
+---------------------------+-------------------------+-----+--------------------------+-----+
| 2019-02-05 18:53:00.0 UTC | NULL                    | ... | ESA_SMS_DEVICE_NTERNAL   | ... |

note: The same query on Druid shows all values.

Calling:
: jdbc:hive2://wdc-tst-bdrd-001.openmarket.c> describe formatted teststuff5;

gives:

INFO : Compiling command(queryId=hive_20190207234008_c536632e-547f-4843-96bc-e92b8577799d): describe formatted teststuff5
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:col_name, type:string, comment:from deserializer), FieldSchema(name:data_type, type:string, comment:from deserializer), FieldSchema(name:comment, type:string, comment:from deserializer)], properties:null)
INFO : Completed compiling command(queryId=hive_20190207234008_c536632e-547f-4843-96bc-e92b8577799d); Time taken: 0.062 seconds
INFO : Executing command(queryId=hive_20190207234008_c536632e-547f-4843-96bc-e92b8577799d): describe formatted teststuff5
INFO : Starting task [Stage-0:DDL] in serial mode
INFO : Completed executing command(queryId=hive_20190207234008_c536632e-547f-4843-96bc-e92b8577799d); Time taken: 0.142 seconds
INFO : OK
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
| col_name | data_type | comment |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+
| # col_name | data_type | comment |
| __time | timestamp with local time zone | from deserializer |
| accepteddate | string | from deserializer |
| accountid | string | from deserializer |
| accountname | string | from deserializer |
| apiversion | string | from deserializer |
| carrierid | string | from deserializer |
| carriername | string | from deserializer |
| companyid | string | from deserializer |
| contentencoding | string | from deserializer |
| countrycode | string | from deserializer |
| countryname | string | from deserializer |
| delivereddate | string | from deserializer |
| destinationaddress | string | from deserializer |
| internalmessageid | string | from deserializer |
| liveoperatorlookup | string | from deserializer |
| messageid | string | from deserializer |
| messageoriginator | string | from deserializer |
| messageoriginatorton | string | from deserializer |
| messagestatus | string | from deserializer |
| messagetype | string | from deserializer |
| mtscore | string | from deserializer |
| parentmessageid | string | from deserializer |
| phonenumber | string | from deserializer |
| productid | string | from deserializer |
| productiddescription | string | from deserializer |
| productname | string | from deserializer |
| productsubtype | string | from deserializer |
| programid | string | from deserializer |
| remoteipaddress | string | from deserializer |
| remoteresponsecode | string | from deserializer |
| responsecode | string | from deserializer |
| responsecodedescription | string | from deserializer |
| sourceaddress | string | from deserializer |
| subaccount | string | from deserializer |
| updateddate | string | from deserializer |
| useragent | string | from deserializer |
| userdataheader | string | from deserializer |
| userdefined1 | string | from deserializer |
| userdefined2 | string | from deserializer |
| | NULL | NULL |
| # Detailed Table Information | NULL | NULL |
| Database: | default | NULL |
| OwnerType: | USER | NULL |
| Owner: | admin | NULL |
| CreateTime: | Thu Feb 07 23:06:21 UTC 2019 | NULL |
| LastAccessTime: | UNKNOWN | NULL |
| Retention: | 0 | NULL |
| Location: | hdfs://wdc-tst-bdrd-001.openmarket.com:8020/warehouse/tablespace/external/hive/teststuff5 | NULL |
| Table Type: | EXTERNAL_TABLE | NULL |
| Table Parameters: | NULL | NULL |
| | COLUMN_STATS_ACCURATE | {\"BASIC_STATS\":\"true\",\"COLUMN_STATS\":{\"__time\":\"true\",\"accepteddate\":\"true\",\"accountid\":\"true\",\"accountname\":\"true\",\"apiversion\":\"true\",\"carrierid\":\"true\",\"carriername\":\"true\",\"companyid\":\"true\",\"contentencoding\":\"true\",\"countrycode\":\"true\",\"countryname\":\"true\",\"delivereddate\":\"true\",\"destinationaddress\":\"true\",\"internalmessageid\":\"true\",\"liveoperatorlookup\":\"true\",\"messageid\":\"true\",\"messageoriginator\":\"true\",\"messageoriginatorton\":\"true\",\"messagestatus\":\"true\",\"messagetype\":\"true\",\"mtscore\":\"true\",\"parentmessageid\":\"true\",\"phonenumber\":\"true\",\"productid\":\"true\",\"productiddescription\":\"true\",\"productname\":\"true\",\"productsubtype\":\"true\",\"programid\":\"true\",\"remoteipaddress\":\"true\",\"remoteresponsecode\":\"true\",\"responsecode\":\"true\",\"responsecodedescription\":\"true\",\"sourceaddress\":\"true\",\"subaccount\":\"true\",\"updateddate\":\"true\",\"useragent\":\"true\",\"userdataheader\":\"true\",\"userdefined1\":\"true\",\"userdefined2\":\"true\"}} |
| | EXTERNAL | TRUE |
| | bucketing_version | 2 |
| | druid.datasource | teststuff5 |
| | numFiles | 0 |
| | numRows | 0 |
| | rawDataSize | 0 |
| | storage_handler | org.apache.hadoop.hive.druid.DruidStorageHandler |
| | totalSize | 0 |
| | transient_lastDdlTime | 1549580781 |
| | NULL | NULL |
| # Storage Information | NULL | NULL |
| SerDe Library: | org.apache.hadoop.hive.druid.serde.DruidSerDe | NULL |
| InputFormat: | null | NULL |
| OutputFormat: | null | NULL |
| Compressed: | No | NULL |
| Num Buckets: | -1 | NULL |
| Bucket Columns: | [] | NULL |
| Sort Columns: | [] | NULL |
| Storage Desc Params: | NULL | NULL |
| | serialization.format | 1 |
+-------------------------------+----------------------------------------------------+----------------------------------------------------+

hiveserver2 logging sample:

2019-02-08T18:45:21,640 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792
2019-02-08T18:45:21,640 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127
2019-02-08T18:45:21,642 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: operation.OperationManager (:()) - Adding operation: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=3b771b6b-7ff7-49dc-b212-dfc6b77211dc]
2019-02-08T18:45:21,646 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Driver (:()) - Compiling command(queryId=hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d): select * from teststuff5 limit 1
2019-02-08T18:45:21,706 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: lockmgr.DbTxnManager (:()) - Opened txnid:185
2019-02-08T18:45:21,707 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Starting Semantic Analysis
2019-02-08T18:45:21,708 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Completed phase 1 of Semantic Analysis
2019-02-08T18:45:21,708 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for source tables
2019-02-08T18:45:21,733 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2019-02-08T18:45:21,733 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2019-02-08T18:45:21,745 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Context (:()) - New scratch dir is hdfs://wdc-tst-bdrd-001:8020/tmp/hive/admin/8dea5cfb-f504-4237-a3b4-fb19d739b792/hive_2019-02-08_18-45-21_696_7674740746834465900-2
2019-02-08T18:45:21,745 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Completed getting MetaData in Semantic Analysis
2019-02-08T18:45:21,896 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for source tables
2019-02-08T18:45:21,907 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for subqueries
2019-02-08T18:45:21,907 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Get metadata for destination tables
2019-02-08T18:45:21,908 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Context (:()) - New scratch dir is hdfs://wdc-tst-bdrd-001:8020/tmp/hive/admin/8dea5cfb-f504-4237-a3b4-fb19d739b792/hive_2019-02-08_18-45-21_696_7674740746834465900-2
2019-02-08T18:45:21,912 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: common.FileUtils (FileUtils.java:mkdir(580)) - Creating directory if it doesn't exist: hdfs://wdc-tst-bdrd-001:8020/tmp/hive/admin/8dea5cfb-f504-4237-a3b4-fb19d739b792/hive_2019-02-08_18-45-21_696_7674740746834465900-2/-mr-10001/.hive-staging_hive_2019-02-08_18-45-21_696_7674740746834465900-2
2019-02-08T18:45:21,917 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - CBO Succeeded; optimized logical plan.
2019-02-08T18:45:21,918 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ppd.OpProcFactory (:()) - Processing for FS(2)
2019-02-08T18:45:21,918 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ppd.OpProcFactory (:()) - Processing for SEL(1)
2019-02-08T18:45:21,918 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ppd.OpProcFactory (:()) - Processing for TS(0)
2019-02-08T18:45:21,921 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: parse.CalcitePlanner (:()) - Completed plan generation
2019-02-08T18:45:21,921 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Driver (:()) - Semantic Analysis Completed (retrial = false)
2019-02-08T18:45:21,921 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Driver (:()) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:teststuff5.__time, type:timestamp with local time zone, comment:null), FieldSchema(name:teststuff5.accepteddate, type:string, comment:null), FieldSchema(name:teststuff5.accountid, type:string, comment:null), FieldSchema(name:teststuff5.accountname, type:string, comment:null), FieldSchema(name:teststuff5.apiversion, type:string, comment:null), FieldSchema(name:teststuff5.carrierid, type:string, comment:null), FieldSchema(name:teststuff5.carriername, type:string, comment:null), FieldSchema(name:teststuff5.companyid, type:string, comment:null), FieldSchema(name:teststuff5.contentencoding, type:string, comment:null), FieldSchema(name:teststuff5.countrycode, type:string, comment:null), FieldSchema(name:teststuff5.countryname, type:string, comment:null), FieldSchema(name:teststuff5.delivereddate, type:string, comment:null), FieldSchema(name:teststuff5.destinationaddress, type:string, comment:null), FieldSchema(name:teststuff5.internalmessageid, type:string, comment:null), FieldSchema(name:teststuff5.liveoperatorlookup, type:string, comment:null), FieldSchema(name:teststuff5.messageid, type:string, comment:null), FieldSchema(name:teststuff5.messageoriginator, type:string, comment:null), FieldSchema(name:teststuff5.messageoriginatorton, type:string, comment:null), FieldSchema(name:teststuff5.messagestatus, type:string, comment:null), FieldSchema(name:teststuff5.messagetype, type:string, comment:null), FieldSchema(name:teststuff5.mtscore, type:string, comment:null), FieldSchema(name:teststuff5.parentmessageid, type:string, comment:null), FieldSchema(name:teststuff5.phonenumber, type:string, comment:null), FieldSchema(name:teststuff5.productid, type:string, comment:null), FieldSchema(name:teststuff5.productiddescription, type:string, comment:null), FieldSchema(name:teststuff5.productname, type:string, comment:null), FieldSchema(name:teststuff5.productsubtype, type:string, comment:null), FieldSchema(name:teststuff5.programid, type:string, comment:null), FieldSchema(name:teststuff5.remoteipaddress, type:string, comment:null), FieldSchema(name:teststuff5.remoteresponsecode, type:string, comment:null), FieldSchema(name:teststuff5.responsecode, type:string, comment:null), FieldSchema(name:teststuff5.responsecodedescription, type:string, comment:null), FieldSchema(name:teststuff5.sourceaddress, type:string, comment:null), FieldSchema(name:teststuff5.subaccount, type:string, comment:null), FieldSchema(name:teststuff5.updateddate, type:string, comment:null), FieldSchema(name:teststuff5.useragent, type:string, comment:null), FieldSchema(name:teststuff5.userdataheader, type:string, comment:null), FieldSchema(name:teststuff5.userdefined1, type:string, comment:null), FieldSchema(name:teststuff5.userdefined2, type:string, comment:null)], properties:null)
2019-02-08T18:45:21,923 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.TableScanOperator (:()) - Initializing operator TS[0]
2019-02-08T18:45:21,924 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792
HiveServer2-Handler-Pool: Thread-127]: exec.SelectOperator (:()) - Initializing operator SEL[1]<br>2019-02-08T18:45:21,924 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.SelectOperator (:()) - SELECT struct<__time:timestamp with local time zone,accepteddate:string,accountid:string,accountname:string,apiversion:string,carrierid:string,carriername:string,companyid:string,contentencoding:string,countrycode:string,countryname:string,delivereddate:string,destinationaddress:string,internalmessageid:string,liveoperatorlookup:string,messageid:string,messageoriginator:string,messageoriginatorton:string,messagestatus:string,messagetype:string,mtscore:string,parentmessageid:string,phonenumber:string,productid:string,productiddescription:string,productname:string,productsubtype:string,programid:string,remoteipaddress:string,remoteresponsecode:string,responsecode:string,responsecodedescription:string,sourceaddress:string,subaccount:string,updateddate:string,useragent:string,userdataheader:string,userdefined1:string,userdefined2:string><br>2019-02-08T18:45:21,924 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.ListSinkOperator (:()) - Initializing operator LIST_SINK[3]<br>2019-02-08T18:45:21,925 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: ql.Driver (:()) - Completed compiling command(queryId=hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d); Time taken: 0.279 seconds<br>2019-02-08T18:45:21,925 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:21,927 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:21,931 INFO 
[HiveServer2-Background-Pool: Thread-4925]: reexec.ReExecDriver (:()) - Execution #1 of query<br>2019-02-08T18:45:21,931 INFO [HiveServer2-Background-Pool: Thread-4925]: lockmgr.DbTxnManager (:()) - Setting lock request transaction to txnid:185 for queryId=hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d<br>2019-02-08T18:45:22,009 INFO [HiveServer2-Background-Pool: Thread-4925]: ql.Driver (:()) - Executing command(queryId=hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d): select * from teststuff5 limit 1<br>2019-02-08T18:45:22,010 INFO [HiveServer2-Background-Pool: Thread-4925]: hooks.HiveProtoLoggingHook (:()) - Received pre-hook notification for: hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d<br>2019-02-08T18:45:22,016 INFO [HiveServer2-Background-Pool: Thread-4925]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,023 INFO [HiveServer2-Background-Pool: Thread-4925]: hooks.HiveProtoLoggingHook (:()) - Received post-hook notification for: hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d<br>2019-02-08T18:45:22,023 INFO [HiveServer2-Background-Pool: Thread-4925]: ql.Driver (:()) - Completed executing command(queryId=hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d); Time taken: 0.014 seconds<br>2019-02-08T18:45:22,023 INFO [HiveServer2-Background-Pool: Thread-4925]: ql.Driver (:()) - OK<br>2019-02-08T18:45:22,023 INFO [HiveServer2-Background-Pool: Thread-4925]: lockmgr.DbTxnManager (:()) - Stopped heartbeat for query: hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d<br>2019-02-08T18:45:22,064 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,064 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 
8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,064 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,065 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,152 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,152 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,152 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,152 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,177 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,177 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,177 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 
8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,177 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,182 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,182 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,186 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: pool.ChannelResourceFactory (:()) - Generating: http://wdc-tst-bdrd-001:8082<br>2019-02-08T18:45:22,211 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,210 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.TableScanOperator (:()) - RECORDS_OUT_INTERMEDIATE:0, RECORDS_OUT_OPERATOR_TS_0:1, <br>2019-02-08T18:45:22,210 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.SelectOperator (:()) - RECORDS_OUT_INTERMEDIATE:0, RECORDS_OUT_OPERATOR_SEL_1:1, <br>2019-02-08T18:45:22,210 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: exec.ListSinkOperator (:()) - RECORDS_OUT_INTERMEDIATE:0, RECORDS_OUT_OPERATOR_LIST_SINK_3:1, <br>2019-02-08T18:45:22,211 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,235 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the 
default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,235 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,235 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,235 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,238 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,238 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,238 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,239 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,240 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,241 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,241 INFO 
[8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,241 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,245 INFO [HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,245 INFO [HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Updating thread name to 8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127<br>2019-02-08T18:45:22,245 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: operation.OperationManager (:()) - Closing operation: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=3b771b6b-7ff7-49dc-b212-dfc6b77211dc]<br>2019-02-08T18:45:22,245 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: operation.OperationManager (:()) - Removed queryId: hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d corresponding to operation: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=3b771b6b-7ff7-49dc-b212-dfc6b77211dc] with tag: null<br>2019-02-08T18:45:22,262 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: operation.Operation (:()) - Closing operation log /tmp/hive/operation_logs/8dea5cfb-f504-4237-a3b4-fb19d739b792/hive_20190208184521_4a573dee-6487-4e4e-a72f-83b343aed10d without delay<br>2019-02-08T18:45:22,262 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: conf.HiveConf (HiveConf.java:getLogIdVar(5130)) - Using the default value passed in for log id: 
8dea5cfb-f504-4237-a3b4-fb19d739b792<br>2019-02-08T18:45:22,262 INFO [8dea5cfb-f504-4237-a3b4-fb19d739b792 HiveServer2-Handler-Pool: Thread-127]: session.SessionState (:()) - Resetting thread name to HiveServer2-Handler-Pool: Thread-127
Labels:
- Apache Hive
01-22-2019
04:50 PM
Indeed, user error on posting my first question here! I believe there is another question with the actual details.