2016-12-15 16:18:05,902 INFO exec.ListSinkOperator (Operator.java:close(612)) - 98 finished. closing...
2016-12-15 16:18:05,902 INFO exec.ListSinkOperator (Operator.java:close(634)) - 98 Close done
2016-12-15 16:18:05,903 WARN ql.Driver (DriverContext.java:shutdown(137)) - Shutting down task : Stage-1:MAPRED
2016-12-15 16:18:06,109 INFO impl.YarnClientImpl (YarnClientImpl.java:killApplication(395)) - Killed application application_1481788223653_0003
2016-12-15 16:18:06,117 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:06,117 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:06,871 WARN thrift.ThriftCLIService (ThriftCLIService.java:GetOperationStatus(621)) - Error getting operation status:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.getOperationStatus(CLIService.java:375)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.GetOperationStatus(ThriftCLIService.java:610)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1473)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1458)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:06,874 WARN thrift.ThriftCLIService (ThriftCLIService.java:FetchResults(681)) - Error fetching results:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1553)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1538)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:06,948 INFO exec.Task (SessionState.java:printInfo(954)) - Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2016-12-15 16:18:06,970 WARN mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2016-12-15 16:18:06,970 INFO exec.Task (SessionState.java:printInfo(954)) - 2016-12-15 16:18:06,970 Stage-1 map = 0%, reduce = 0%
2016-12-15 16:18:06,974 WARN mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2016-12-15 16:18:06,975 ERROR exec.Task (SessionState.java:printError(963)) - Ended Job = job_1481788223653_0003 with errors
2016-12-15 16:18:06,976 ERROR exec.Task (SessionState.java:printError(963)) - Error during job, obtaining debugging information...
2016-12-15 16:18:06,980 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:06,980 ERROR ql.Driver (SessionState.java:printError(963)) - FAILED: Operation cancelled
2016-12-15 16:18:06,980 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:06,980 INFO ql.Driver (SessionState.java:printInfo(954)) - MapReduce Jobs Launched:
2016-12-15 16:18:06,980 WARN mapreduce.Counters (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2016-12-15 16:18:06,980 INFO ql.Driver (SessionState.java:printInfo(954)) - Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
2016-12-15 16:18:06,980 INFO ql.Driver (SessionState.java:printInfo(954)) - Total MapReduce CPU Time Spent: 0 msec
2016-12-15 16:18:06,981 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:06,981 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,560 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,560 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,561 INFO parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: use bigsql
2016-12-15 16:18:13,561 INFO parse.ParseDriver (ParseDriver.java:parse(209)) - Parse Completed
2016-12-15 16:18:13,561 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,561 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,561 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_database: bigsql
2016-12-15 16:18:13,562 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_database: bigsql
2016-12-15 16:18:13,568 INFO ql.Driver (Driver.java:compile(436)) - Semantic Analysis Completed
2016-12-15 16:18:13,568 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,568 INFO ql.Driver (Driver.java:getSchema(240)) - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2016-12-15 16:18:13,568 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,568 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,568 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,569 INFO ql.Driver (Driver.java:checkConcurrency(160)) - Concurrency mode is disabled, not creating a lock manager
2016-12-15 16:18:13,569 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,569 INFO ql.Driver (Driver.java:execute(1328)) - Starting command(queryId=hive_20161215161813_dfb08711-d2e4-40bc-81f5-7d8280644944): use bigsql
2016-12-15 16:18:13,569 INFO hooks.ATSHook (ATSHook.java:(84)) - Created ATS Hook
2016-12-15 16:18:13,569 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,569 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,570 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,570 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,570 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,570 INFO ql.Driver (Driver.java:launchTask(1651)) - Starting task [Stage-0:DDL] in serial mode
2016-12-15 16:18:13,570 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_database: bigsql
2016-12-15 16:18:13,570 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_database: bigsql
2016-12-15 16:18:13,575 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_database: bigsql
2016-12-15 16:18:13,575 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_database: bigsql
2016-12-15 16:18:13,580 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,580 INFO hooks.ATSHook (ATSHook.java:(84)) - Created ATS Hook
2016-12-15 16:18:13,580 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,581 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,581 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,581 INFO ql.Driver (SessionState.java:printInfo(954)) - OK
2016-12-15 16:18:13,581 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,581 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,581 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,587 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,588 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,588 INFO parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: insert into drivers(driverid,name,ssn,location,certified,wageplan) values(1,'test',234555,'test','test','test')
2016-12-15 16:18:13,589 INFO parse.ParseDriver (ParseDriver.java:parse(209)) - Parse Completed
2016-12-15 16:18:13,589 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,589 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,589 INFO parse.CalcitePlanner (SemanticAnalyzer.java:analyzeInternal(10127)) - Starting Semantic Analysis
2016-12-15 16:18:13,590 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,590 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,626 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,626 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,644 INFO parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(10074)) - Completed phase 1 of Semantic Analysis
2016-12-15 16:18:13,645 INFO parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1552)) - Get metadata for source tables
2016-12-15 16:18:13,645 INFO parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1704)) - Get metadata for subqueries
2016-12-15 16:18:13,645 INFO parse.CalcitePlanner (SemanticAnalyzer.java:getMetaData(1728)) - Get metadata for destination tables
2016-12-15 16:18:13,645 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,645 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_table : db=bigsql tbl=drivers
2016-12-15 16:18:13,662 INFO parse.CalcitePlanner (SemanticAnalyzer.java:genResolvedParseTree(10078)) - Completed getting MetaData in Semantic Analysis
2016-12-15 16:18:13,663 INFO parse.BaseSemanticAnalyzer (CalcitePlanner.java:canCBOHandleAst(388)) - Not invoking CBO because the statement has too few joins
2016-12-15 16:18:13,664 INFO common.FileUtils (FileUtils.java:mkdir(501)) - Creating directory if it doesn't exist: hdfs://ibm-biginsight.com:8020/apps/hive/warehouse/bigsql.db/drivers/.hive-staging_hive_2016-12-15_16-18-13_587_7758687425733444141-1
2016-12-15 16:18:13,678 INFO parse.CalcitePlanner (SemanticAnalyzer.java:genFileSinkPlan(6653)) - Set stats collection dir : hdfs://ibm-biginsight.com:8020/apps/hive/warehouse/bigsql.db/drivers/.hive-staging_hive_2016-12-15_16-18-13_587_7758687425733444141-1/-ext-10001
2016-12-15 16:18:13,680 INFO ppd.OpProcFactory (OpProcFactory.java:process(655)) - Processing for FS(102)
2016-12-15 16:18:13,680 INFO ppd.OpProcFactory (OpProcFactory.java:process(655)) - Processing for SEL(101)
2016-12-15 16:18:13,680 INFO ppd.OpProcFactory (OpProcFactory.java:process(655)) - Processing for SEL(100)
2016-12-15 16:18:13,680 INFO ppd.OpProcFactory (OpProcFactory.java:process(382)) - Processing for TS(99)
2016-12-15 16:18:13,682 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,683 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,683 INFO optimizer.GenMRFileSink1 (GenMRFileSink1.java:process(103)) - using CombineHiveInputformat for the merge job
2016-12-15 16:18:13,684 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_indexes : db=bigsql tbl=values__tmp__table__1
2016-12-15 16:18:13,684 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_indexes : db=bigsql tbl=values__tmp__table__1
2016-12-15 16:18:13,687 INFO metastore.HiveMetaStore (HiveMetaStore.java:logInfo(746)) - 1: get_indexes : db=bigsql tbl=values__tmp__table__1
2016-12-15 16:18:13,687 INFO HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(371)) - ugi=admin ip=unknown-ip-addr cmd=get_indexes : db=bigsql tbl=values__tmp__table__1
2016-12-15 16:18:13,689 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans where optimization is applicable
2016-12-15 16:18:13,689 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2016-12-15 16:18:13,689 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans where optimization is applicable
2016-12-15 16:18:13,690 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2016-12-15 16:18:13,690 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(175)) - Looking for table scans where optimization is applicable
2016-12-15 16:18:13,690 INFO physical.NullScanTaskDispatcher (NullScanTaskDispatcher.java:dispatch(199)) - Found 0 null table scans
2016-12-15 16:18:13,691 INFO physical.Vectorizer (Vectorizer.java:validateMapWork(369)) - Validating MapWork...
2016-12-15 16:18:13,691 INFO physical.Vectorizer (Vectorizer.java:validateMapWork(397)) - Input format: org.apache.hadoop.mapred.TextInputFormat, doesn't provide vectorized input
2016-12-15 16:18:13,691 INFO parse.CalcitePlanner (SemanticAnalyzer.java:analyzeInternal(10213)) - Completed plan generation
2016-12-15 16:18:13,691 INFO ql.Driver (Driver.java:compile(436)) - Semantic Analysis Completed
2016-12-15 16:18:13,691 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,691 INFO ql.Driver (Driver.java:getSchema(240)) - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:int, comment:null), FieldSchema(name:_col1, type:string, comment:null), FieldSchema(name:_col2, type:bigint, comment:null), FieldSchema(name:_col3, type:string, comment:null), FieldSchema(name:_col4, type:string, comment:null), FieldSchema(name:_col5, type:string, comment:null)], properties:null)
2016-12-15 16:18:13,691 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,693 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,693 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,693 INFO ql.Driver (Driver.java:checkConcurrency(160)) - Concurrency mode is disabled, not creating a lock manager
2016-12-15 16:18:13,693 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,693 INFO ql.Driver (Driver.java:execute(1328)) - Starting command(queryId=hive_20161215161813_8cb132ca-5a8a-4514-a3e9-03f32d039ef9): insert into drivers(driverid,name,ssn,location,certified,wageplan) values(1,'test',234555,'test','test','test')
2016-12-15 16:18:13,693 INFO hooks.ATSHook (ATSHook.java:(84)) - Created ATS Hook
2016-12-15 16:18:13,693 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,694 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,694 INFO ql.Driver (SessionState.java:printInfo(954)) - Query ID = hive_20161215161813_8cb132ca-5a8a-4514-a3e9-03f32d039ef9
2016-12-15 16:18:13,694 INFO ql.Driver (SessionState.java:printInfo(954)) - Total jobs = 3
2016-12-15 16:18:13,694 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:13,694 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,694 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,695 INFO ql.Driver (SessionState.java:printInfo(954)) - Launching Job 1 out of 3
2016-12-15 16:18:13,696 INFO ql.Driver (Driver.java:launchTask(1651)) - Starting task [Stage-1:MAPRED] in serial mode
2016-12-15 16:18:13,696 INFO exec.Task (SessionState.java:printInfo(954)) - Number of reduce tasks is set to 0 since there's no reduce operator
2016-12-15 16:18:13,701 INFO ql.Context (Context.java:getMRScratchDir(330)) - New scratch dir is hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12
2016-12-15 16:18:13,705 INFO hooks.ATSHook (ATSHook.java:createPreHookEvent(149)) - Received pre-hook notification for :hive_20161215161813_8cb132ca-5a8a-4514-a3e9-03f32d039ef9
2016-12-15 16:18:13,706 INFO mr.ExecDriver (ExecDriver.java:execute(288)) - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
2016-12-15 16:18:13,706 INFO mr.ExecDriver (ExecDriver.java:execute(310)) - adding libjars: file:///usr/iop/current/hbase-client/lib/hbase-client.jar,file:///usr/iop/current/hbase-client/lib/hbase-common.jar,file:///usr/iop/current/hbase-client/lib/hbase-hadoop2-compat.jar,file:///usr/iop/current/hbase-client/lib/hbase-prefix-tree.jar,file:///usr/iop/current/hbase-client/lib/hbase-protocol.jar,file:///usr/iop/current/hbase-client/lib/hbase-server.jar,file:///usr/iop/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,file:///usr/iop/current/hive-webhcat/share/hcatalog
2016-12-15 16:18:13,707 INFO exec.Utilities (Utilities.java:getInputPaths(3398)) - Processing alias values__tmp__table__1
2016-12-15 16:18:13,707 INFO exec.Utilities (Utilities.java:getInputPaths(3415)) - Adding input file hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/_tmp_space.db/Values__Tmp__Table__1
2016-12-15 16:18:13,707 INFO exec.Utilities (Utilities.java:isEmptyPath(2699)) - Content Summary not cached for hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/_tmp_space.db/Values__Tmp__Table__1
2016-12-15 16:18:13,710 INFO ql.Context (Context.java:getMRScratchDir(330)) - New scratch dir is hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12
2016-12-15 16:18:13,715 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:13,715 INFO exec.Utilities (Utilities.java:serializePlan(939)) - Serializing MapWork via kryo
2016-12-15 16:18:14,140 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:14,142 ERROR mr.ExecDriver (ExecDriver.java:execute(400)) - yarn
2016-12-15 16:18:14,237 INFO impl.TimelineClientImpl (TimelineClientImpl.java:serviceInit(296)) - Timeline service address: http://ibm-biginsight.com:8188/ws/v1/timeline/
2016-12-15 16:18:14,238 INFO client.RMProxy (RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at ibm-biginsight.com/10.10.15.15:8050
2016-12-15 16:18:14,242 INFO fs.FSStatsPublisher (FSStatsPublisher.java:init(49)) - created : hdfs://ibm-biginsight.com:8020/apps/hive/warehouse/bigsql.db/drivers/.hive-staging_hive_2016-12-15_16-18-13_587_7758687425733444141-1/-ext-10001
2016-12-15 16:18:14,336 INFO impl.TimelineClientImpl (TimelineClientImpl.java:serviceInit(296)) - Timeline service address: http://ibm-biginsight.com:8188/ws/v1/timeline/
2016-12-15 16:18:14,337 INFO client.RMProxy (RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at ibm-biginsight.com/10.10.15.15:8050
2016-12-15 16:18:14,338 INFO exec.Utilities (Utilities.java:getBaseWork(391)) - PLAN PATH = hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/map.xml
2016-12-15 16:18:14,339 INFO exec.Utilities (Utilities.java:getBaseWork(391)) - PLAN PATH = hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/reduce.xml
2016-12-15 16:18:14,339 INFO exec.Utilities (Utilities.java:getBaseWork(401)) - ***************non-local mode***************
2016-12-15 16:18:14,339 INFO exec.Utilities (Utilities.java:getBaseWork(405)) - local path = hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/reduce.xml
2016-12-15 16:18:14,339 INFO exec.Utilities (Utilities.java:getBaseWork(417)) - Open file to read in plan: hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/reduce.xml
2016-12-15 16:18:14,341 INFO exec.Utilities (Utilities.java:getBaseWork(457)) - File not found: File does not exist: /tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/reduce.xml
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1815)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1786)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1699)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
2016-12-15 16:18:14,341 INFO exec.Utilities (Utilities.java:getBaseWork(458)) - No plan file found: hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/reduce.xml
2016-12-15 16:18:14,348 WARN mapreduce.JobResourceUploader (JobResourceUploader.java:uploadFiles(64)) - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2016-12-15 16:18:15,070 INFO log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) -
2016-12-15 16:18:15,070 INFO exec.Utilities (Utilities.java:getBaseWork(391)) - PLAN PATH = hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/hive_2016-12-15_16-18-13_587_7758687425733444141-12/-mr-10004/d2f6b1e8-31de-491c-ac47-1bfce86603a0/map.xml
2016-12-15 16:18:15,070 INFO io.CombineHiveInputFormat (CombineHiveInputFormat.java:getSplits(517)) - Total number of paths: 1, launching 1 threads to check non-combinable ones.
2016-12-15 16:18:15,074 INFO io.CombineHiveInputFormat (CombineHiveInputFormat.java:getCombineSplits(439)) - CombineHiveInputSplit creating pool for hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/_tmp_space.db/Values__Tmp__Table__1; using filter path hdfs://ibm-biginsight.com:8020/tmp/hive/admin/42e486e9-fa21-45d9-8d85-2973cf01e66c/_tmp_space.db/Values__Tmp__Table__1
2016-12-15 16:18:15,078 INFO input.FileInputFormat (FileInputFormat.java:listStatus(283)) - Total input paths to process : 1
2016-12-15 16:18:15,079 INFO input.CombineFileInputFormat (CombineFileInputFormat.java:createSplits(413)) - DEBUG: Terminated node allocation with : CompletedNodes: 1, size left: 0
2016-12-15 16:18:15,079 INFO io.CombineHiveInputFormat (CombineHiveInputFormat.java:getCombineSplits(494)) - number of splits 1
2016-12-15 16:18:15,079 INFO io.CombineHiveInputFormat (CombineHiveInputFormat.java:getSplits(587)) - Number of all splits 1
2016-12-15 16:18:15,079 INFO log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) -
2016-12-15 16:18:15,101 INFO mapreduce.JobSubmitter (JobSubmitter.java:submitJobInternal(198)) - number of splits:1
2016-12-15 16:18:15,120 INFO mapreduce.JobSubmitter (JobSubmitter.java:printTokens(287)) - Submitting tokens for job: job_1481788223653_0004
2016-12-15 16:18:15,367 INFO impl.YarnClientImpl (YarnClientImpl.java:submitApplication(273)) - Submitted application application_1481788223653_0004
2016-12-15 16:18:15,370 INFO mapreduce.Job (Job.java:submit(1294)) - The url to track the job: http://ibm-biginsight.com:8088/proxy/application_1481788223653_0004/
2016-12-15 16:18:15,370 INFO exec.Task (SessionState.java:printInfo(954)) - Starting Job = job_1481788223653_0004, Tracking URL = http://ibm-biginsight.com:8088/proxy/application_1481788223653_0004/
2016-12-15 16:18:15,370 INFO exec.Task (SessionState.java:printInfo(954)) - Kill Command = /usr/iop/4.1.0.0/hadoop/bin/hadoop job -kill job_1481788223653_0004
2016-12-15 16:18:18,963 WARN thrift.ThriftCLIService (ThriftCLIService.java:GetOperationStatus(621)) - Error getting operation status:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.getOperationStatus(CLIService.java:375)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.GetOperationStatus(ThriftCLIService.java:610)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1473)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1458)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:18,966 WARN thrift.ThriftCLIService (ThriftCLIService.java:FetchResults(681)) - Error fetching results:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1553)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1538)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:29,152 WARN thrift.ThriftCLIService (ThriftCLIService.java:GetOperationStatus(621)) - Error getting operation status:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.getOperationStatus(CLIService.java:375)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.GetOperationStatus(ThriftCLIService.java:610)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1473)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$GetOperationStatus.getResult(TCLIService.java:1458)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:29,154 WARN thrift.ThriftCLIService (ThriftCLIService.java:FetchResults(681)) - Error fetching results:
org.apache.hive.service.cli.HiveSQLException: Invalid OperationHandle: OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=1112a67b-1817-4720-bf04-40f60e456d51]
    at org.apache.hive.service.cli.operation.OperationManager.getOperation(OperationManager.java:151)
    at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:454)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1553)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1538)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
2016-12-15 16:18:36,146 INFO thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(294)) - Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V8
2016-12-15 16:18:36,160 INFO session.SessionState (SessionState.java:createPath(641)) - Created local directory: /tmp/6ec1cb22-8263-44b8-b6f4-e7688a0ea95b_resources
2016-12-15 16:18:36,163 INFO session.SessionState (SessionState.java:createPath(641)) - Created HDFS directory: /tmp/hive/anonymous/6ec1cb22-8263-44b8-b6f4-e7688a0ea95b
2016-12-15 16:18:36,165 INFO session.SessionState (SessionState.java:createPath(641)) - Created local directory: /tmp/hive/6ec1cb22-8263-44b8-b6f4-e7688a0ea95b
2016-12-15 16:18:36,167 INFO session.SessionState (SessionState.java:createPath(641)) - Created HDFS directory: /tmp/hive/anonymous/6ec1cb22-8263-44b8-b6f4-e7688a0ea95b/_tmp_space.db
2016-12-15 16:18:36,169 INFO session.HiveSessionImpl (HiveSessionImpl.java:setOperationLogSessionDir(236)) - Operation log session directory is created: /tmp/hive/operation_logs/6ec1cb22-8263-44b8-b6f4-e7688a0ea95b
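For reference, the statements in this log (use bigsql, then the INSERT into drivers) arrive at HiveServer2 over its Thrift interface, and the repeated "Invalid OperationHandle" warnings are what the server logs when a client keeps calling GetOperationStatus/FetchResults for a handle it no longer tracks, for example after the earlier query was cancelled (FAILED: Operation cancelled, application_1481788223653_0003 killed). A minimal JDBC sketch of such a client-side submission is shown below; the host, database, user, and INSERT statement come from the log, while the HiveServer2 port 10000 and the empty password are assumptions for a default, unsecured setup.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveInsertExample {
    public static void main(String[] args) throws Exception {
        // Standard Hive JDBC driver from the cluster's hive-jdbc jar.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Host and database come from the log; port 10000 and the empty
        // password are assumptions for an unsecured HiveServer2.
        String url = "jdbc:hive2://ibm-biginsight.com:10000/bigsql";

        try (Connection conn = DriverManager.getConnection(url, "admin", "");
             Statement stmt = conn.createStatement()) {
            // The INSERT from the log; HiveServer2 compiles it into a
            // MapReduce job (job_1481788223653_0004 above).
            stmt.execute("insert into drivers(driverid,name,ssn,location,certified,wageplan) "
                    + "values(1,'test',234555,'test','test','test')");
        } catch (SQLException e) {
            // If the statement is cancelled or the session goes away while the
            // client is still polling, the server emits the kind of
            // "Invalid OperationHandle" warnings seen in the log above.
            e.printStackTrace();
        }
    }
}

The warnings typically point to a client/server handle mismatch (a stale or cancelled operation being polled) rather than to a problem with the INSERT statement itself.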