Support Questions


Pig and Hive

Super Collaborator

Hi,

After running this:

register /usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core-1.2.1.2.3.2.0-2950.jar;
register /usr/lib/piggybank/hcatalog-pig-adapter-0.11.0.jar;
register /usr/lib/piggybank/hcatalog-core-0.11.0.jar;
register /usr/lib/piggybank/hive-shims-0.11.0.jar;
...
F = LOAD 'journey_pig' USING org.apache.hive.hcatalog.pig.HCatLoader();

The hive-shims-0.11.0.jar contains the class with this method:

public abstract UserGroupInformation getUGIForConf(Configuration paramConfiguration)
    throws LoginException, IOException;

I receive this error:

2016-02-15 12:54:42,359 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2998: Unhandled internal error. org.apache.hadoop.hive.shims.HadoopShims.getUGIForConf(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation;
2016-02-15 12:54:42,359 [main] ERROR org.apache.pig.tools.grunt.Grunt - java.lang.NoSuchMethodError: org.apache.hadoop.hive.shims.HadoopShims.getUGIForConf(Lorg/apache/hadoop/conf/Configuration;)Lorg/apache/hadoop/security/UserGroupInformation;
        at org.apache.hcatalog.common.HiveClientCache$HiveClientCacheKey.<init>(HiveClientCache.java:201)
        at org.apache.hcatalog.common.HiveClientCache$HiveClientCacheKey.fromHiveConf(HiveClientCache.java:207)
        at org.apache.hcatalog.common.HiveClientCache.get(HiveClientCache.java:138)
        at org.apache.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:544)
        at org.apache.hcatalog.pig.PigHCatUtil.getHiveMetaClient(PigHCatUtil.java:147)
        at org.apache.hcatalog.pig.PigHCatUtil.getTable(PigHCatUtil.java:183)
        at org.apache.hcatalog.pig.HCatLoader.getSchema(HCatLoader.java:193)
        at org.apache.pig.newplan.logical.relational.LOLoad.getSchemaFromMetaData(LOLoad.java:175)
        at org.apache.pig.newplan.logical.relational.LOLoad.<init>(LOLoad.java:89)
        at org.apache.pig.parser.LogicalPlanBuilder.buildLoadOp(LogicalPlanBuilder.java:901)
        at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3568)
        at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1625)
        at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:1102)
        at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:560)
        at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:421)
        at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:191)
        at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1735)
        at org.apache.pig.PigServer$Graph.access$000(PigServer.java:1443)
        at org.apache.pig.PigServer.parseAndBuild(PigServer.java:387)
        at org.apache.pig.PigServer.executeBatch(PigServer.java:412)
        at org.apache.pig.PigServer.executeBatch(PigServer.java:398)
        at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:171)
        at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:749)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:230)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
        at org.apache.pig.Main.run(Main.java:502)
        at org.apache.pig.Main.main(Main.java:177)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Any suggestions, please?

Many thanks

1 ACCEPTED SOLUTION

Mentor

@Roberto Sancho please use the Hive 1.2.1 and Pig 0.15 jars.

This script uses Hive, Pig, and HCatalog and was tested on the latest sandbox: https://github.com/dbist/oozie/tree/master/apps/hcatalog. Also execute Pig with the -useHCatalog switch.
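To illustrate, the invocation looks like this (a minimal sketch; the script name is a placeholder, and it assumes you are on an HDP node where the pig client is installed and configured):

```
# -useHCatalog puts the matching HCatalog and Hive jars on the classpath
# automatically, so no register statements are needed in the script itself.
pig -useHCatalog myscript.pig
```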


6 REPLIES 6

Mentor

@Roberto Sancho please use the Hive 1.2.1 and Pig 0.15 jars.

This script uses Hive, Pig, and HCatalog and was tested on the latest sandbox: https://github.com/dbist/oozie/tree/master/apps/hcatalog. Also execute Pig with the -useHCatalog switch.

@Roberto Sancho

Please see this tutorial: http://hortonworks.com/hadoop-tutorial/how-to-use-hcatalog-basic-pig-hive-commands/

You can follow it step by step to make this work.

Super Collaborator

Many thanks. The Pig version is:

Pig 0.15.0.2.3

and the Hive version is:

Hive 1.2.1.2

Is this correct?

Thanks.

Mentor

Yes, just look at my script, @Roberto Sancho.

Mentor

@Roberto Sancho as you can see in my script, I am not registering Hive jars; just make sure you use -useHCatalog. I had the same issue in this thread: https://community.hortonworks.com/content/idea/2391/pig-and-hive-actions-should-include-hive-hcatalo...

Also, in your original script you're mixing Hive and Pig 0.11 jars with Hive 1.2.1 — that's why I said to use the latest jars. If you still need to register jars, take the Pig jars from /usr/hdp/current/pig-client and the Hive jars from /usr/hdp/current/hive, and remove the 0.11 jars completely. But in my script I don't register any Pig or Hive jars at all; just fix your classpath.
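As a sketch, the register statements would point at the current HDP client directories instead of the copied 0.11 jars (exact jar file names vary by HDP build, so treat these as assumptions and check what is actually on disk):

```
-- Register jars from the HDP "current" symlinks so the versions always
-- match the installed Hive; never mix in hand-copied 0.11 jars.
register /usr/hdp/current/hive-client/lib/hive-exec.jar;
register /usr/hdp/current/hive-client/lib/hive-metastore.jar;
```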

Super Collaborator

Hi:

It worked for me like this, many thanks 🙂

register /usr/lib/piggybank/hive-hcatalog-pig-adapter.jar;
register /usr/lib/piggybank/hive-common.jar;
register /usr/lib/piggybank/hive-metastore.jar;
register /usr/lib/piggybank/hive-exec.jar;
register /usr/lib/piggybank/hive-serde.jar;
register /usr/lib/piggybank/hive-shims.jar;
register /usr/lib/piggybank/libfb303.jar;