
Hive running on Tez: INSERT INTO fails with the following exception, but SELECT works OK

Expert Contributor
INFO : Tez session hasn't been created yet. Opening session
INFO : Dag name: insert into default.test(node...VALUES('dd')(Stage-1)
INFO : Dag submit failed due to org.apache.hadoop.fs.FSOutputSummer.<init>(Ljava/util/zip/Checksum;II)V
    at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1340)
    at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1369)
    at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1401)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1382)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1307)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:384)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:380)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:380)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:324)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:852)
    at org.apache.tez.dag.history.recovery.RecoveryService.handleSummaryEvent(RecoveryService.java:393)
    at org.apache.tez.dag.history.recovery.RecoveryService.handle(RecoveryService.java:310)
    at org.apache.tez.dag.history.HistoryEventHandler.handleCriticalEvent(HistoryEventHandler.java:104)
    at org.apache.tez.dag.app.DAGAppMaster.startDAG(DAGAppMaster.java:2204)
    at org.apache.tez.dag.app.DAGAppMaster.submitDAGToAppMaster(DAGAppMaster.java:1225)
    at org.apache.tez.dag.api.client.DAGClientHandler.submitDAG(DAGClientHandler.java:118)
    at org.apache.tez.dag.api.client.rpc.DAGClientAMProtocolBlockingPBServerImpl.submitDAG(DAGClientAMProtocolBlockingPBServerImpl.java:163)
    at org.apache.tez.dag.api.client.rpc.DAGClientAMProtocolRPC$DAGClientAMProtocol$2.callBlockingMethod(DAGClientAMProtocolRPC.java:7471)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)
stack trace: [
    org.apache.hadoop.ipc.Client.call(Client.java:1427),
    org.apache.hadoop.ipc.Client.call(Client.java:1358),
    org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229),
    com.sun.proxy.$Proxy43.submitDAG(Unknown Source),
    org.apache.tez.client.TezClient.submitDAGSession(TezClient.java:517),
    org.apache.tez.client.TezClient.submitDAG(TezClient.java:434),
    org.apache.hadoop.hive.ql.exec.tez.TezTask.submit(TezTask.java:439),
    org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:180),
    org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160),
    org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89),
    org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1720),
    org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1477),
    org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1254),
    org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118),
    org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113),
    org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154),
    org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71),
    org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206),
    java.security.AccessController.doPrivileged(Native Method),
    javax.security.auth.Subject.doAs(Subject.java:422),
    org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657),
    org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218),
    java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511),
    java.util.concurrent.FutureTask.run(FutureTask.java:266),
    java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142),
    java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617),
    java.lang.Thread.run(Thread.java:745)
] retrying...
ERROR : Failed to execute tez graph.
org.apache.hadoop.ipc.RemoteException(java.lang.NoSuchMethodError): org.apache.hadoop.fs.FSOutputSummer.<init>(Ljava/util/zip/Checksum;II)V

Any ideas on how to debug this? The logs tell me nothing.

Mike
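
As background on what the message literally says: the descriptor (Ljava/util/zip/Checksum;II)V is JVM shorthand for a constructor taking (java.util.zip.Checksum, int, int). So DFSOutputStream was compiled against an FSOutputSummer that had such a constructor, while the FSOutputSummer actually loaded at runtime does not. A minimal reflection sketch (the class name ListCtors is ours, not part of Hadoop; run it with the same classpath HiveServer2 uses) prints what the loaded class really offers:

    import java.lang.reflect.Constructor;

    // Sketch: print the constructors the loaded FSOutputSummer actually has,
    // to compare against the (Checksum, int, int) one the error says is missing.
    public class ListCtors {
        public static void main(String[] args) throws Exception {
            Class<?> c = Class.forName("org.apache.hadoop.fs.FSOutputSummer");
            for (Constructor<?> ctor : c.getDeclaredConstructors()) {
                System.out.println(ctor);
            }
        }
    }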


6 REPLIES

Expert Contributor

Googled this line:

INFO : Dag submit failed due to org.apache.hadoop.fs.FSOutputSummer

But couldn't find anything.

Expert Contributor

...also, when I run Hive on MapReduce it works fine.

Guru

Are you able to run any other queries with Tez? From the exceptions, you seem to be running into an environment issue with wrong versions of jars. Please give more details about your environment (HDP version, OS details, etc.). Is this the state after an upgrade, or were you never able to run a Tez query in this environment?
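
One way to confirm a jar version mismatch like the one suggested here, a minimal sketch assuming you can run a class on the same classpath the failing JVM sees, is to print which jar each side of the failing call was actually loaded from (WhichJar is our own name; the two class names are taken from the stack trace):

    // Sketch: print the jar each Hadoop class was loaded from, to spot a
    // hadoop-common / hadoop-hdfs version mismatch.
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            String[] names = {
                "org.apache.hadoop.fs.FSOutputSummer",      // lives in hadoop-common
                "org.apache.hadoop.hdfs.DFSOutputStream"    // lives in hadoop-hdfs
            };
            for (String name : names) {
                Class<?> c = Class.forName(name);
                System.out.println(name + " -> "
                        + c.getProtectionDomain().getCodeSource().getLocation());
            }
        }
    }

If the two locations are jars from different Hadoop versions, that is the whole problem.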

Expert Contributor

HDP 2.4.0.0-169

OS: Ubuntu 14.04

The only change I have made recently is upgrading Java from 1.7 to 1.8.0_91.

Guru

This points to a jar version mismatch. That class is in hadoop-common.jar. Have you manually copied any jars? Please check that all hadoop-common jars are at version 2.7.1.2.4.0.0-169.
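
One mechanical way to run that check, a minimal sketch assuming the suspect JVM's classpath is visible via the java.class.path property, is to scan every classpath jar for a copy of FSOutputSummer.class; a second copy, or a copy inside a third-party jar, points at the offender:

    import java.io.File;
    import java.util.jar.JarFile;

    // Sketch: list every classpath jar that contains FSOutputSummer.class.
    // More than one hit, or a hit outside the stock Hadoop jars, points at
    // the jar carrying the mismatched version.
    public class FindCopies {
        public static void main(String[] args) throws Exception {
            String target = "org/apache/hadoop/fs/FSOutputSummer.class";
            String cp = System.getProperty("java.class.path");
            for (String entry : cp.split(File.pathSeparator)) {
                if (!entry.endsWith(".jar") || !new File(entry).isFile()) continue;
                try (JarFile jar = new JarFile(entry)) {
                    if (jar.getEntry(target) != null) {
                        System.out.println(target + " found in " + entry);
                    }
                }
            }
        }
    }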

Expert Contributor

This was it. I had put some Hive SerDe UDF jar files on the classpath with dependencies that caused a mismatch. Thanks for pointing me in the right direction.
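
For anyone landing here with the same symptom: a "fat" SerDe/UDF jar that bundles its own copies of Hadoop classes will shadow the cluster's jars. A quick scan of a suspect jar, a sketch where the jar path is passed as a program argument, shows whether it carries bundled org.apache.hadoop classes:

    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Sketch: report any org.apache.hadoop classes bundled inside a UDF/SerDe
    // jar (pass the jar path as args[0]). Bundled copies shadow the cluster's
    // own Hadoop jars and cause exactly this kind of mismatch.
    public class ScanUdfJar {
        public static void main(String[] args) throws Exception {
            try (JarFile jar = new JarFile(args[0])) {
                jar.stream()
                   .map(JarEntry::getName)
                   .filter(n -> n.startsWith("org/apache/hadoop/") && n.endsWith(".class"))
                   .forEach(n -> System.out.println("bundled: " + n));
            }
        }
    }

The usual fix for such jars is to mark the Hadoop dependencies as provided (or relocate them) when building the UDF jar, so the cluster's own versions are used at runtime.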