Loading data to Hive via Pig - org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found

Hi, I am trying to load a Hive table from a Pig relation, and the STORE fails. The Pig statements are:

C = LOAD 'ml-1m/output0121' USING PigStorage(',') AS (id:int, title:chararray, genre:chararray);
STORE C INTO 'gidb.movies123' USING org.apache.hive.hcatalog.pig.HCatStorer();
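For context, the error below shows that the target table gidb.movies123 is defined with org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe, i.e. a table created roughly along these lines (a sketch only; the delimiter and storage clauses here are assumed, not taken from the actual DDL):

-- "::" is an assumed example of a multi-character field delimiter; use the table's real delimiter
CREATE TABLE gidb.movies123 (id INT, title STRING, genre STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
WITH SERDEPROPERTIES ("field.delim"="::")
STORED AS TEXTFILE;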

The Pig session was started with:

pig -useHCatalog  

It fails with the log below. The message is clear: org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe is not found.

I have registered the JARs in Hive, and LIST JARS shows them as well. Please let me know what I am missing. Thanks in advance.

hive> LIST JARS;

/tmp/b4cff5e9-2695-4658-b82c-798d6465227b_resources/jars

/tmp/b4cff5e9-2695-4658-b82c-798d6465227b_resources/hive-contrib-0.10.0.jar
2017-01-22 05:22:37,930 [main] INFO  hive.metastore - Trying to connect to metastore with URI thrift://ip....
2017-01-22 05:22:37,940 [main] INFO  hive.metastore - Connected to metastore.
2017-01-22 05:22:37,962 [main] ERROR hive.log - error in initSerDe: java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:395)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
        at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
        at org.apache.hive.hcatalog.common.HCatUtil.extractSchema(HCatUtil.java:158)
        at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:179)
        at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
        at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:191)
        at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:57)
        at org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:66)
        at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
        at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
        at org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
        at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
        at org.apache.pig.newplan.logical.relational.LogicalPlan.validate(LogicalPlan.java:212)
        at org.apache.pig.PigServer$Graph.compile(PigServer.java:1808)
        at org.apache.pig.PigServer$Graph.access$300(PigServer.java:1484)
        at org.apache.pig.PigServer.execute(PigServer.java:1397)
        at org.apache.pig.PigServer.access$500(PigServer.java:116)
        at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1730)
        at org.apache.pig.PigServer.registerQuery(PigServer.java:664)
        at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1082)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
        at org.apache.pig.Main.run(Main.java:565)
        at org.apache.pig.Main.main(Main.java:177)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2017-01-22 05:22:37,962 [main] ERROR hive.ql.metadata.Table - Unable to get field from serde: org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe
java.lang.RuntimeException: MetaException(message:java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:278)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
        at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
        at org.apache.hive.hcatalog.common.HCatUtil.extractSchema(HCatUtil.java:158)
        at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:179)
        at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
        at org.apache.hive.hcatalog.pig.HCatStorer.setStoreLocation(HCatStorer.java:191)
        at org.apache.pig.newplan.logical.visitor.InputOutputFileValidatorVisitor.visit(InputOutputFileValidatorVisitor.java:57)
        at org.apache.pig.newplan.logical.relational.LOStore.accept(LOStore.java:66)
        at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:64)
        at org.apache.pig.newplan.DepthFirstWalker.depthFirst(DepthFirstWalker.java:66)
        at org.apache.pig.newplan.DepthFirstWalker.walk(DepthFirstWalker.java:53)
        at org.apache.pig.newplan.PlanVisitor.visit(PlanVisitor.java:52)
        at org.apache.pig.newplan.logical.relational.LogicalPlan.validate(LogicalPlan.java:212)
        at org.apache.pig.PigServer$Graph.compile(PigServer.java:1808)
        at org.apache.pig.PigServer$Graph.access$300(PigServer.java:1484)
        at org.apache.pig.PigServer.execute(PigServer.java:1397)
        at org.apache.pig.PigServer.access$500(PigServer.java:116)
        at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1730)
        at org.apache.pig.PigServer.registerQuery(PigServer.java:664)
        at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1082)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:505)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
        at org.apache.pig.Main.run(Main.java:565)
        at org.apache.pig.Main.main(Main.java:177)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: MetaException(message:java.lang.ClassNotFoundException Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:409)
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
        ... 32 more
2017-01-22 05:22:37,965 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1115: org.apache.hive.hcatalog.common.HCatException : 2001 : Error setting output information. Cause : java.io.IOException: Failed to load foster storage handler

3 REPLIES

Master Mentor

@G I

Is this "/tmp/b4cff5e9-2695-4658-b82c-798d6465227b_resources/hive-contrib-0.10.0.jar" the same jar that you got from the following location:

/usr/hdp/<version>/hive/lib/hive-contrib-<version>.jar

I mean, is the version correct, i.e. the one shipped with HDP? The hive-contrib-0.10.0.jar shown in your LIST JARS output is quite old and may not contain this SerDe at all. Also, can you try the following once to make sure that the JAR with the proper permissions is loaded/added:

hive> ADD JAR /usr/hdp/<version>/hive/lib/hive-contrib-<version>.jar;
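For example, to double-check that the contrib JAR on the host actually contains this SerDe class before adding it (shell commands, keeping the <version> placeholder from above; the jar tool ships with the JDK):

ls -l /usr/hdp/<version>/hive/lib/hive-contrib-*.jar
# list the classes inside the JAR; no output from grep means this JAR does not ship MultiDelimitSerDe
jar tf /usr/hdp/<version>/hive/lib/hive-contrib-<version>.jar | grep MultiDelimitSerDe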

Also you might want to try the following:

- On your Hive Server host, create the directory "/usr/hdp/<version>/hive/auxlib".

- Copy "/usr/hdp/<version>/hive/lib/hive-contrib-<version>.jar" to "/usr/hdp/<version>/hive/auxlib".

- Then restart HiveServer2 (see the sketch below).
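A minimal sketch of those steps, assuming the standard HDP layout (substitute your actual version for <version>; restarting through Ambari is just one way to bounce HiveServer2):

mkdir -p /usr/hdp/<version>/hive/auxlib
cp /usr/hdp/<version>/hive/lib/hive-contrib-<version>.jar /usr/hdp/<version>/hive/auxlib/
# restart HiveServer2 afterwards (e.g. from Ambari) so the JAR in auxlib is picked up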

I have copied the correct JAR file and tried; it didn't work. I haven't tried your suggestion of creating the auxlib directory (due to an access issue). I shall check and let you know. Thank you for the pointers.

@Jay SenSharma Thanks for your pointers regarding the JAR file. I have added hive-contrib.jar alone, instead of hive-contrib-1.2.1.2.3.4.0-3485.jar, and it fixed the issue.

Reference: https://cwiki.apache.org/confluence/display/Hive/MultiDelimitSerDe
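For anyone hitting the same error, the Pig side of the fix looked roughly like this. It is a sketch only: the /usr/hdp/current/hive-client/lib/ path is an assumption for where the unversioned hive-contrib.jar lives, and registering it from the Pig session is just one way to make the class visible to the HCatStorer client.

pig -useHCatalog

-- register the contrib JAR so MultiDelimitSerDe resolves on the client (path assumed, adjust to your cluster)
REGISTER /usr/hdp/current/hive-client/lib/hive-contrib.jar;
C = LOAD 'ml-1m/output0121' USING PigStorage(',') AS (id:int, title:chararray, genre:chararray);
STORE C INTO 'gidb.movies123' USING org.apache.hive.hcatalog.pig.HCatStorer();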