
Druid insert throws vertex error with Hive LLAP in HDP 3.1.0



New Contributor

Connected to HiveServer2 Interactive (LLAP) via the JDBC connection string; the INSERT statement throws a vertex error.

The Druid broker and coordinator configuration properties are set correctly, yet the INSERT still fails with this vertex error. Please help.


0: jdbc:hive2://ipadress:2181> CREATE EXTERNAL TABLE upgdevdb1.hivedruid_test5 (`__time` TIMESTAMP, `course` STRING, `id` STRING, `name` STRING, `year` STRING) Stored BY 'org.apache.hadoop.hive.druid.DruidStorageHandler';

INFO : Compiling command(queryId=hive_20190711193720_8376c2c0-bd72-413d-8c81-033ed55e94d9): CREATE EXTERNAL TABLE upgdevdb1.hivedruid_test5 (`__time` TIMESTAMP, `course` STRING, `id` STRING, `name` STRING, `year` STRING) Stored BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'

INFO : Semantic Analysis Completed (retrial = false)

INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)

INFO : Completed compiling command(queryId=hive_20190711193720_8376c2c0-bd72-413d-8c81-033ed55e94d9); Time taken: 0.028 seconds

INFO : Executing command(queryId=hive_20190711193720_8376c2c0-bd72-413d-8c81-033ed55e94d9): CREATE EXTERNAL TABLE upgdevdb1.hivedruid_test5 (`__time` TIMESTAMP, `course` STRING, `id` STRING, `name` STRING, `year` STRING) Stored BY 'org.apache.hadoop.hive.druid.DruidStorageHandler'

INFO : Starting task [Stage-0:DDL] in serial mode

INFO : Completed executing command(queryId=hive_20190711193720_8376c2c0-bd72-413d-8c81-033ed55e94d9); Time taken: 0.129 seconds

INFO : OK

No rows affected (0.174 seconds)

0: jdbc:hive2://ipaddress:2181> insert into hivedruid_test5 values(1541193507549,'anil','1','test','2016');

INFO : Compiling command(queryId=hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7): insert into hivedruid_test5 values(1541193507549,'anil','1','test','2016')

INFO : Semantic Analysis Completed (retrial = false)

INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_col0, type:timestamp, comment:null), FieldSchema(name:_col1, type:string, comment:null), FieldSchema(name:_col2, type:string, comment:null), FieldSchema(name:_col3, type:string, comment:null), FieldSchema(name:_col4, type:string, comment:null)], properties:null)

INFO : Completed compiling command(queryId=hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7); Time taken: 0.405 seconds

INFO : Executing command(queryId=hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7): insert into hivedruid_test5 values(1541193507549,'anil','1','test','2016')

INFO : Query ID = hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7

INFO : Total jobs = 1

INFO : Starting task [Stage-0:DDL] in serial mode

INFO : Starting task [Stage-1:DDL] in serial mode

INFO : Launching Job 1 out of 1

INFO : Starting task [Stage-2:MAPRED] in serial mode

INFO : Subscribed to counters: [] for queryId: hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7

INFO : Session is already open

INFO : Dag name: insert into hivedruid...','1','test','2016') (Stage-2)

ERROR : Status: Failed

ERROR : Vertex failed, vertexName=Reducer 2, vertexId=vertex_1562408971899_20433_98_01, diagnostics=[Task failed, taskId=task_1562408971899_20433_98_01_000007, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : java.lang.RuntimeException: java.lang.NoSuchMethodError: org.joda.time.format.DateTimeFormatter.withZoneUTC()Lorg/joda/time/format/DateTimeFormatter;

at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecordVector(ReduceRecordSource.java:401)

at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecord(ReduceRecordSource.java:249)

at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordProcessor.run(ReduceRecordProcessor.java:318)

at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:267)

at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:250)

at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)

at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)

at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)

at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)

at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)

at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)

at org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:110)

at java.util.concurrent.FutureTask.run(FutureTask.java:266)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

at java.lang.Thread.run(Thread.java:745)

Caused by: java.lang.NoSuchMethodError: org.joda.time.format.DateTimeFormatter.withZoneUTC()Lorg/joda/time/format/DateTimeFormatter;

at org.apache.hive.druid.com.fasterxml.jackson.datatype.joda.cfg.FormatConfig.createUTC(FormatConfig.java:71)

at org.apache.hive.druid.com.fasterxml.jackson.datatype.joda.cfg.FormatConfig.<clinit>(FormatConfig.java:23)

at org.apache.hive.druid.com.fasterxml.jackson.datatype.joda.deser.PeriodDeserializer.<init>(PeriodDeserializer.java:19)

at org.apache.hive.druid.com.fasterxml.jackson.datatype.joda.deser.PeriodDeserializer.<init>(PeriodDeserializer.java:24)

at org.apache.hive.druid.io.druid.jackson.JodaStuff.register(JodaStuff.java:54)

at org.apache.hive.druid.io.druid.jackson.DruidDefaultSerializersModule.<init>(DruidDefaultSerializersModule.java:49)

at org.apache.hive.druid.io.druid.jackson.DefaultObjectMapper.<init>(DefaultObjectMapper.java:46)

at org.apache.hive.druid.io.druid.jackson.DefaultObjectMapper.<init>(DefaultObjectMapper.java:35)

at org.apache.hadoop.hive.druid.DruidStorageHandlerUtils.<clinit>(DruidStorageHandlerUtils.java:227)

at org.apache.hadoop.hive.druid.io.DruidOutputFormat.getHiveRecordWriter(DruidOutputFormat.java:95)

at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:297)

at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:282)

at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:786)

at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:737)

at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:903)

at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.process(VectorFileSinkOperator.java:111)

at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:965)

at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:938)

at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:158)

at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.processVectorGroup(ReduceRecordSource.java:490)

at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecordVector(ReduceRecordSource.java:392)

... 18 more

, errorMessage=Cannot recover from this error:java.lang.RuntimeException: java.lang.NoSuchMethodError: org.joda.time.format.DateTimeFormatter.withZoneUTC()Lorg/joda/time/format/DateTimeFormatter;

[stack trace identical to the one above omitted]

]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1562408971899_20433_98_01 [Reducer 2] killed/failed due to:OWN_TASK_FAILURE]

ERROR : DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0

INFO : org.apache.tez.common.counters.DAGCounter:

INFO : NUM_FAILED_TASKS: 1

INFO : NUM_SUCCEEDED_TASKS: 113

INFO : TOTAL_LAUNCHED_TASKS: 114

INFO : AM_CPU_MILLISECONDS: 260

INFO : AM_GC_TIME_MILLIS: 0

INFO : File System Counters:

INFO : FILE_BYTES_READ: 0

INFO : FILE_BYTES_WRITTEN: 2780

INFO : FILE_READ_OPS: 0

INFO : FILE_LARGE_READ_OPS: 0

INFO : FILE_WRITE_OPS: 0

INFO : HDFS_BYTES_READ: 0

INFO : HDFS_BYTES_WRITTEN: 0

INFO : HDFS_READ_OPS: 0

INFO : HDFS_LARGE_READ_OPS: 0

INFO : HDFS_WRITE_OPS: 0

INFO : org.apache.tez.common.counters.TaskCounter:

INFO : REDUCE_INPUT_GROUPS: 0

INFO : REDUCE_INPUT_RECORDS: 0

INFO : COMBINE_INPUT_RECORDS: 0

INFO : SPILLED_RECORDS: 1

INFO : NUM_SHUFFLED_INPUTS: 0

INFO : NUM_SKIPPED_INPUTS: 112

INFO : NUM_FAILED_SHUFFLE_INPUTS: 0

INFO : MERGED_MAP_OUTPUTS: 0

INFO : TASK_DURATION_MILLIS: 3672

INFO : INPUT_RECORDS_PROCESSED: 4

INFO : INPUT_SPLIT_LENGTH_BYTES: 1

INFO : OUTPUT_RECORDS: 1

INFO : OUTPUT_LARGE_RECORDS: 0

INFO : OUTPUT_BYTES: 38

INFO : OUTPUT_BYTES_WITH_OVERHEAD: 46

INFO : OUTPUT_BYTES_PHYSICAL: 60

INFO : ADDITIONAL_SPILLS_BYTES_WRITTEN: 0

INFO : ADDITIONAL_SPILLS_BYTES_READ: 0

INFO : ADDITIONAL_SPILL_COUNT: 0

INFO : SHUFFLE_CHUNK_COUNT: 1

INFO : SHUFFLE_BYTES: 0

INFO : SHUFFLE_BYTES_DECOMPRESSED: 0

INFO : SHUFFLE_BYTES_TO_MEM: 0

INFO : SHUFFLE_BYTES_TO_DISK: 0

INFO : SHUFFLE_BYTES_DISK_DIRECT: 0

INFO : NUM_MEM_TO_DISK_MERGES: 0

INFO : NUM_DISK_TO_DISK_MERGES: 0

INFO : SHUFFLE_PHASE_TIME: 261

INFO : MERGE_PHASE_TIME: 284

INFO : FIRST_EVENT_RECEIVED: 247

INFO : LAST_EVENT_RECEIVED: 247

INFO : HIVE:

INFO : DESERIALIZE_ERRORS: 0

INFO : RECORDS_IN_Map_1: 3

INFO : RECORDS_OUT_1_upgdevdb1.hivedruid_test5: 0

INFO : RECORDS_OUT_INTERMEDIATE_Map_1: 1

INFO : RECORDS_OUT_INTERMEDIATE_Reducer_2: 0

INFO : RECORDS_OUT_OPERATOR_FS_10: 0

INFO : RECORDS_OUT_OPERATOR_MAP_0: 0

INFO : RECORDS_OUT_OPERATOR_RS_7: 1

INFO : RECORDS_OUT_OPERATOR_SEL_1: 1

INFO : RECORDS_OUT_OPERATOR_SEL_3: 1

INFO : RECORDS_OUT_OPERATOR_SEL_6: 1

INFO : RECORDS_OUT_OPERATOR_SEL_9: 0

INFO : RECORDS_OUT_OPERATOR_TS_0: 1

INFO : RECORDS_OUT_OPERATOR_UDTF_2: 1

INFO : Shuffle Errors:

INFO : BAD_ID: 0

INFO : CONNECTION: 0

INFO : IO_ERROR: 0

INFO : WRONG_LENGTH: 0

INFO : WRONG_MAP: 0

INFO : WRONG_REDUCE: 0

INFO : Shuffle Errors_Reducer_2_INPUT_Map_1:

INFO : BAD_ID: 0

INFO : CONNECTION: 0

INFO : IO_ERROR: 0

INFO : WRONG_LENGTH: 0

INFO : WRONG_MAP: 0

INFO : WRONG_REDUCE: 0

INFO : TaskCounter_Map_1_INPUT__dummy_table:

INFO : INPUT_RECORDS_PROCESSED: 4

INFO : INPUT_SPLIT_LENGTH_BYTES: 1

INFO : TaskCounter_Map_1_OUTPUT_Reducer_2:

INFO : ADDITIONAL_SPILLS_BYTES_READ: 0

INFO : ADDITIONAL_SPILLS_BYTES_WRITTEN: 0

INFO : ADDITIONAL_SPILL_COUNT: 0

INFO : OUTPUT_BYTES: 38

INFO : OUTPUT_BYTES_PHYSICAL: 60

INFO : OUTPUT_BYTES_WITH_OVERHEAD: 46

INFO : OUTPUT_LARGE_RECORDS: 0

INFO : OUTPUT_RECORDS: 1

INFO : SHUFFLE_CHUNK_COUNT: 1

INFO : SPILLED_RECORDS: 1

INFO : TaskCounter_Reducer_2_INPUT_Map_1:

INFO : ADDITIONAL_SPILLS_BYTES_READ: 0

INFO : ADDITIONAL_SPILLS_BYTES_WRITTEN: 0

INFO : COMBINE_INPUT_RECORDS: 0

INFO : FIRST_EVENT_RECEIVED: 247

INFO : LAST_EVENT_RECEIVED: 247

INFO : MERGED_MAP_OUTPUTS: 0

INFO : MERGE_PHASE_TIME: 284

INFO : NUM_DISK_TO_DISK_MERGES: 0

INFO : NUM_FAILED_SHUFFLE_INPUTS: 0

INFO : NUM_MEM_TO_DISK_MERGES: 0

INFO : NUM_SHUFFLED_INPUTS: 0

INFO : NUM_SKIPPED_INPUTS: 112

INFO : REDUCE_INPUT_GROUPS: 0

INFO : REDUCE_INPUT_RECORDS: 0

INFO : SHUFFLE_BYTES: 0

INFO : SHUFFLE_BYTES_DECOMPRESSED: 0

INFO : SHUFFLE_BYTES_DISK_DIRECT: 0

INFO : SHUFFLE_BYTES_TO_DISK: 0

INFO : SHUFFLE_BYTES_TO_MEM: 0

INFO : SHUFFLE_PHASE_TIME: 261

INFO : SPILLED_RECORDS: 0

INFO : TaskCounter_Reducer_2_OUTPUT_out_Reducer_2:

INFO : OUTPUT_RECORDS: 0

INFO : org.apache.hadoop.hive.llap.counters.LlapWmCounters:

INFO : GUARANTEED_QUEUED_NS: 0

INFO : GUARANTEED_RUNNING_NS: 0

INFO : SPECULATIVE_QUEUED_NS: 5345368

INFO : SPECULATIVE_RUNNING_NS: 1908978794

ERROR : FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Reducer 2, vertexId=vertex_1562408971899_20433_98_01 [same NoSuchMethodError stack trace as above omitted] Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1562408971899_20433_98_01 [Reducer 2] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0

INFO : Completed executing command(queryId=hive_20190711193731_3b4cf0a3-b3a3-4701-a1f3-5d8839b18ee7); Time taken: 0.596 seconds

Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Reducer 2, vertexId=vertex_1562408971899_20433_98_01 [same NoSuchMethodError stack trace as above omitted] DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)


1 REPLY 1

Re: Druid insert throws vertex error with Hive LLAP in HDP 3.1.0

New Contributor

Caused by: java.lang.NoSuchMethodError: org.joda.time.format.DateTimeFormatter.withZoneUTC()Lorg/joda/time/format/DateTimeFormatter;
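A NoSuchMethodError like this usually points to a Joda-Time version conflict on the LLAP/Tez classpath: DateTimeFormatter.withZoneUTC() was added in Joda-Time 2.0, so if an older 1.x joda-time jar appears earlier on the classpath than the one the Druid storage handler expects, the call fails at runtime even though everything compiled. A quick way to check is to scan the Hive lib directory on a node for every jar that ships the class. The snippet below is a sketch (the lib path is a hypothetical example; adjust it to your cluster layout):

```python
import zipfile
from pathlib import Path

def find_class_providers(jar_dir, class_entry):
    """Return all jars under jar_dir that contain class_entry.

    More than one hit means JVM classpath ordering decides which
    version wins -- the usual trigger for NoSuchMethodError.
    """
    hits = []
    for jar in sorted(Path(jar_dir).glob("*.jar")):
        try:
            with zipfile.ZipFile(jar) as zf:
                if class_entry in zf.namelist():
                    hits.append(str(jar))
        except zipfile.BadZipFile:
            continue  # skip corrupt or non-jar files
    return hits

if __name__ == "__main__":
    # Hypothetical path: point this at the Hive/LLAP lib dir on a node.
    print(find_class_providers(
        "/usr/hdp/current/hive-server2/lib",
        "org/joda/time/format/DateTimeFormatter.class"))
```

If more than one jar turns up, compare their joda-time versions; removing or shading the pre-2.0 copy (after verifying nothing else depends on it) is the typical remedy for this class of error.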