Support Questions


java.io.IOException: Previous writer likely failed to write hdfs://sandbox.hortonworks.com:8020/tmp/hive/hive/_tez_session_dir/cfcd5d27-c46c-440e-b6e2-51bf35bcbf43/hive-hcatalog-core.jar. Failing because I am unlikely to write too

Expert Contributor

Hi All,

I have a Kerberized HDP 2.4 cluster, and I'm trying to access Hive on the command line.

I'm logged in as root and have obtained a Kerberos ticket for the hive user.
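
For reference, I obtained the ticket roughly like this (the keytab path is the standard HDP location and is an assumption; adjust for your setup):

kinit -kt /etc/security/keytabs/hive.service.keytab hive/sandbox.hortonworks.com@EXAMPLE.COM   # keytab path assumed (default HDP layout)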

When I run the hive command, I get the following error:

Any ideas?

------------------------------------------------------------------

[root@sandbox ~]# hive
WARNING: Use "yarn jar" to launch YARN applications.
Logging initialized using configuration in file:/etc/hive/2.4.0.0-169/0/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.io.IOException: Previous writer likely failed to write hdfs://sandbox.hortonworks.com:8020/tmp/hive/hive/_tez_session_dir/cfcd5d27-c46c-440e-b6e2-51bf35bcbf43/hive-hcatalog-core.jar. Failing because I am unlikely to write too.
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:507)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:680)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Previous writer likely failed to write hdfs://sandbox.hortonworks.com:8020/tmp/hive/hive/_tez_session_dir/cfcd5d27-c46c-440e-b6e2-51bf35bcbf43/hive-hcatalog-core.jar. Failing because I am unlikely to write too.
        at org.apache.hadoop.hive.ql.exec.tez.DagUtils.localizeResource(DagUtils.java:982)
        at org.apache.hadoop.hive.ql.exec.tez.DagUtils.addTempResources(DagUtils.java:862)
        at org.apache.hadoop.hive.ql.exec.tez.DagUtils.localizeTempFilesFromConf(DagUtils.java:805)
        at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.refreshLocalResourcesFromConf(TezSessionState.java:233)
        at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:158)
        at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:117)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:504)
        ... 8 more

[root@sandbox ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hive/sandbox.hortonworks.com@EXAMPLE.COM

Valid starting     Expires            Service principal
11/30/16 02:03:26  12/01/16 02:03:26  krbtgt/EXAMPLE.COM@EXAMPLE.COM
        renew until 11/30/16 02:03:26

4 REPLIES

Expert Contributor

Please note:

The NameNode was in safe mode, so I took it out of safe mode.

Apart from this, no other changes were made.
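
For reference, safe mode is typically cleared with the standard dfsadmin command (run with HDFS superuser credentials):

hdfs dfsadmin -safemode leave   # standard HDFS admin command for leaving safe mode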

Expert Contributor

@Neeraj Sabharwal, @Kibrom Gebrehiwot, @Laurent Edel - any ideas on this issue?

Expert Contributor

@Neeraj Sabharwal @Laurent Edel

For now, I've changed the execution engine to MapReduce to work around the problem, but I'm now getting the following error when accessing the Hive View:

https://community.hortonworks.com/questions/69550/kerberized-hdp-24-not-able-to-access-hive-view.htm...

Any ideas?
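
For completeness, the engine switch is just the standard Hive setting; roughly:

set hive.execution.engine=mr;              -- per-session, inside the Hive CLI

or when launching the CLI:

hive --hiveconf hive.execution.engine=mr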

Contributor

This error usually indicates that you have aux jars defined in hive-site.xml. For now (HDP 2.5 and below), aux jars need to be set on the client side rather than as a server-side property when using Hive on Tez. There is an improvement request tracking this.
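
As a rough sketch of the client-side approach (the jar path below is only an example for the HDP sandbox; substitute whatever aux jars you actually need), remove hive.aux.jars.path from hive-site.xml and supply the jars when launching the CLI instead:

# Example only: supply aux jars on the client instead of hive.aux.jars.path in hive-site.xml
export HIVE_AUX_JARS_PATH=/usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar   # example jar path
hive

# or pass them explicitly at launch
hive --auxpath /usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar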