
[HDP 2.5 / Hive 2.1 / Beeline] Why doesn't ADD JAR work with Hive 2.1?


New Contributor

Hi,

I'm testing Hive 2.1 (with Beeline) and I have a problem when I want to use my custom UDF. With Hive 1.2.1, ADD JAR works, but not with Hive 2.1.

The ADD JAR command itself completes successfully:

ADD JAR hdfs:////user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar;
INFO : Added [/tmp/a8f12df6-6e85-4272-8431-9cdecc2edb61_resources/my-custom-format-0.0.1-SNAPSHOT.jar] to class path
INFO : Added resources: [hdfs:////user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar]
No rows affected (0.121 seconds)

But when I use the UDF, I get a ClassNotFoundException:

Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com.test.hive.format.MyCustomInputFileFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:181)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:199)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:206)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:585)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:494)
    at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:471)
    at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:416)
    ... 21 more
Caused by: java.lang.ClassNotFoundException: com.test.hive.format.MyCustomInputFileFormat
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
    ... 44 more
]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0, Vertex vertex_1487758716484_0014_7_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
Vertex killed, vertexName=Reducer 2, vertexId=vertex_1487758716484_0014_7_01, diagnostics=[Vertex received Kill while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:9, Vertex vertex_1487758716484_0014_7_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1 (state=08S01,code=2)

The same test works with Hive 1.2.1.
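
For reference, the class that cannot be found is the table's input format (com.test.hive.format.MyCustomInputFileFormat in the trace above). A minimal sketch of the kind of setup involved; the table, column, and location names here are made up:

ADD JAR hdfs:////user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar;

-- hypothetical table definition using the custom input format from the jar
CREATE EXTERNAL TABLE my_test_table (line STRING)
STORED AS INPUTFORMAT 'com.test.hive.format.MyCustomInputFileFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION '/user/test/data/';

-- any query that scans the table fails on Hive 2.1 with the Kryo error above
SELECT * FROM my_test_table LIMIT 10;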

Do you have any ideas?

Thanks for your help

4 REPLIES

Re: [HDP 2.5 / Hive 2.1 / Beeline] Why doesn't ADD JAR work with Hive 2.1?

Explorer

The jar path you typed follows the wrong path pattern:

ADD JAR hdfs:////user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar

That form does not work properly.

Try it again with one of the forms below.

To add a jar from the local file system, use file:///home/username/some_lib/...jar or just /home/username/some_lib/...jar.

To add a jar from HDFS, use hdfs://namenode_fqdn:8020/user/username/lib/...jar, or if you have set up NameNode HA, just use the nameservice ID, such as hdfs://nameserviceid/user/username/lib/...jar.
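
For example (the hostname, nameservice ID, and paths below are placeholders; substitute your own):

-- jar on the local file system of the HiveServer2 host
ADD JAR file:///home/username/some_lib/my-custom-format-0.0.1-SNAPSHOT.jar;

-- jar on HDFS, with an explicit NameNode address
ADD JAR hdfs://namenode.example.com:8020/user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar;

-- jar on HDFS with NameNode HA, addressed by nameservice ID
ADD JAR hdfs://mycluster/user/test/lib/my-custom-format-0.0.1-SNAPSHOT.jar;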


Re: [HDP 2.5 / Hive 2.1 / Beeline] Why doesn't ADD JAR work with Hive 2.1?

Cloudera Employee

You can check which exact version of hive-jdbc your HiveServer2 actually uses.

Log in to your HiveServer2 host in a terminal, and type

netstat -pan |grep 10000

or

ps -ef |grep hiveserver2

to find the process ID of HiveServer2, then list the jars it has loaded with

lsof -p <processID> | grep hive-jdbc

to verify the version of hive-jdbc.
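
Putting those steps together on the HiveServer2 host (the PID 12345 below is just an example; use whichever PID the first two commands report):

# find the process listening on the HiveServer2 port (10000 by default)
netstat -pan | grep 10000

# or find the HiveServer2 process by name
ps -ef | grep hiveserver2

# then list the hive-jdbc jars that process has open
lsof -p 12345 | grep hive-jdbc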


Re: [HDP 2.5 / Hive 2.1 / Beeline] Why doesn't ADD JAR work with Hive 2.1?

Super Guru

Re: [HDP 2.5 / Hive 2.1 / Beeline] Why doesn't ADD JAR work with Hive 2.1?

Explorer