
Launcher ERROR - Pig.Main

Contributor

I am trying to run an Oozie job that runs a Pig script. My job.properties file is local and my workflow.xml is in HDFS. Below is my properties file.

nameNode=hdfs://192.168.56.104:8020
jobTracker=192.168.56.104:8050
queueName=default
flattenRoot=Flatten_Tweet2
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/oozie/${flattenRoot}
oozie.action.sharelib.for.pig=pig,hcatalog,hive
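
For reference, a local properties file like this is submitted with the standard Oozie CLI; the server URL below is a placeholder, not anything from my cluster:

oozie job -oozie http://<oozie-server>:11000/oozie -config job.properties -run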

and here is my workflow file.

<workflow-app name='pig-wf' xmlns="uri:oozie:workflow:0.3">
    <start to='pig-node'/>
    <action name='pig-node'>
        <pig>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <script>Flatten_Tweet2.pig</script>
            <file>lib/hive-site.xml</file>
        </pig>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Pig failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
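
For context, Oozie resolves the <script> and <file> paths relative to oozie.wf.application.path, so the expected HDFS layout is roughly the following (an illustrative listing, not actual output):

hdfs dfs -ls -R /user/<user.name>/oozie/Flatten_Tweet2
# /user/<user.name>/oozie/Flatten_Tweet2/workflow.xml
# /user/<user.name>/oozie/Flatten_Tweet2/Flatten_Tweet2.pig
# /user/<user.name>/oozie/Flatten_Tweet2/lib/hive-site.xml
# /user/<user.name>/oozie/Flatten_Tweet2/lib/  (plus any extra jars the action needs)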

I keep getting the below error no matter what I try.

Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.PigMain], exit code [2]

Any help would be appreciated. Thanks.

1 ACCEPTED SOLUTION

Contributor

After a lot of trial and error, reading documentation, and beers, I was able to figure out that the issue was the name of my Pig jar. I renamed it to pig.jar in the lib folder, backed out all my changes to pig-env.sh, and my Oozie job ran and succeeded. Thank you to everyone for your help, and especially to Jitendra for taking the time to give suggestions.
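
For anyone hitting the same error, the fix boiled down to something like the following; the original jar name is a placeholder, since only the renamed pig.jar matters:

hdfs dfs -mv /user/<user.name>/oozie/Flatten_Tweet2/lib/<original-pig-jar>.jar \
    /user/<user.name>/oozie/Flatten_Tweet2/lib/pig.jar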


16 REPLIES

Master Guru
@Montrial Harrell

Can you please check the Oozie launcher logs? Please refer to this article.

You can post the logs here so that we can have a look; most probably the "stderr" part of the launcher logs will give you a hint.
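
If you don't have the ResourceManager UI handy, something along these lines from the command line should get them; the job and application IDs below are placeholders:

# Find the launcher's YARN application ID from the Oozie job info
oozie job -oozie http://<oozie-server>:11000/oozie -info <oozie-job-id>

# Dump the launcher container logs, including the stderr section
yarn logs -applicationId <application-id>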

Contributor

I took a look at the stderr logs and got the output below. Does this make sense?

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/23/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/44/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Details at logfile: /hadoop/yarn/local/usercache/mbharrel/appcache/application_1463627394303_0038/container_e16_1463627394303_0038_01_000002/pig-job_1463627394303_0038.log

Contributor

Actually, that is not the issue. When I looked further down in the log I found the error below. I will investigate, but I would appreciate any guidance as well.

ERROR 4010: Cannot find hadoop configurations in classpath (neither hadoop-site.xml nor core-site.xml was found in the classpath).

Super Guru

@Montrial Harrell

Seems like Pig doesn't know where to find the conf files. Can you please set the env properties below in pig-env.sh and run again?

export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:/etc/hadoop/conf

export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR
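
On an HDP node that file usually lives at /etc/pig/conf/pig-env.sh; appending the exports would look roughly like this (path assumed, please verify on your node):

# Single quotes keep the variables literal in the file so they expand at Pig launch time
echo 'export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:/etc/hadoop/conf' >> /etc/pig/conf/pig-env.sh
echo 'export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR' >> /etc/pig/conf/pig-env.sh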

Also, please let us know your HDP version, and did you change any properties in the cluster recently? I don't think this should be the default behavior.

Super Guru

@Montrial Harrell

Thanks for sharing that info. Can you please run the command below and see which Pig jar contains these conf files?

grep -iR hadoop-site.xml /usr/hdp/<version>/pig/
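
grep on a binary jar only reports that a match exists; to see exactly which jar bundles the file, a rough check (assuming unzip is available on the node):

# List each jar's contents and print the ones that package hadoop-site.xml
for j in /usr/hdp/<version>/pig/*.jar; do
  unzip -l "$j" | grep -q hadoop-site.xml && echo "$j"
done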

Super Guru

@Montrial Harrell

Please check for core-site.xml also.

grep -iR core-site.xml /usr/hdp/<version>/pig/

If you don't see any jar file in the output, then I think you can try copying these two XML files from /etc/hadoop/conf to the pig/conf dir and see if that resolves the issue.
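
Roughly like this; note that hadoop-site.xml is deprecated and may not exist under /etc/hadoop/conf, in which case core-site.xml alone is the one to copy:

cp /etc/hadoop/conf/core-site.xml /usr/hdp/<version>/pig/conf/
# Copy hadoop-site.xml only if it is actually present
[ -f /etc/hadoop/conf/hadoop-site.xml ] && cp /etc/hadoop/conf/hadoop-site.xml /usr/hdp/<version>/pig/conf/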

Contributor

I did not get any Pig jars when I ran grep -iR core-site.xml /usr/hdp/2.4.0.0-169/pig. The only output was the below. Should I copy the XML files to pig/conf as you instructed?

/usr/hdp/2.4.0.0-169/pig/doc/api/constant-values.html:<td><code>"core-site.xml"</code></td>
/usr/hdp/2.4.0.0-169/pig/conf/pig.properties:# By default, Pig expects hadoop configs (hadoop-site.xml and core-site.xml)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-4247: S3 properties are not picked up from core-site.xml in local mode (cheolsoo)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-3145: Parameters in core-site.xml and mapred-site.xml are not correctly substituted (cheolsoo)

Super Guru

@Montrial Harrell

Yes, please try.

Super Guru

@Montrial Harrell

Yes, that's fine.