Member since: 04-23-2017
Posts: 30
Kudos Received: 3
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5568 | 06-11-2016 04:19 AM
 | 5817 | 05-22-2016 05:51 AM
05-21-2016
11:56 PM
The job still fails with the same error in the Oozie log:
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.PigMain]
05-21-2016
11:30 PM
I have copied both files from /etc/hadoop/conf to /usr/hdp/2.4.0.0-169/pig/conf and chmod'ed them to 755. Is this correct?
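For anyone following along, the copy step being discussed can be sketched as below. The stand-in temp directories replace the real /etc/hadoop/conf and /usr/hdp/2.4.0.0-169/pig/conf paths (which only exist on the sandbox), so the sketch can be tried anywhere:

```shell
# Stand-ins for the Hadoop conf dir and the Pig conf dir from the thread;
# on the sandbox, use the real paths instead.
SRC=$(mktemp -d)
DST=$(mktemp -d)
printf '<configuration/>\n' > "$SRC/core-site.xml"
printf '<configuration/>\n' > "$SRC/hdfs-site.xml"

for f in core-site.xml hdfs-site.xml; do
  cp "$SRC/$f" "$DST/"
  chmod 644 "$DST/$f"   # 644 is sufficient for files; 755 matters for directories
done
ls "$DST"
```

Note the files only need to be readable by the user running Pig; 755 on the files themselves is harmless but not required.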
05-21-2016
11:03 PM
I did not get any pig jars when I ran grep -iR core-site.xml /usr/hdp/2.4.0.0-169/pig. The only output was the below. Should I copy the xml files to the pig/conf as you instructed?
/usr/hdp/2.4.0.0-169/pig/doc/api/constant-values.html:<td><code>"core-site.xml"</code></td>
/usr/hdp/2.4.0.0-169/pig/conf/pig.properties:# By default, Pig expects hadoop configs (hadoop-site.xml and core-site.xml)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-4247: S3 properties are not picked up from core-site.xml in local mode (cheolsoo)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-3145: Parameters in core-site.xml and mapred-site.xml are not correctly substituted (cheolsoo)
05-21-2016
10:51 PM
Here is what my new pig-env.sh looks like.
JAVA_HOME=/usr/lib/jvm/java
HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/current/hadoop-client}
export HADOOP_CONF_DIR=$HADOOP_CONF_DIR:/etc/hadoop/conf
export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR
if [ -d "/usr/lib/tez" ]; then
PIG_OPTS="$PIG_OPTS -Dmapreduce.framework.name=yarn"
fi
05-21-2016
10:46 PM
Here is the output, Jitendra. Do you see anything wrong?
/usr/hdp/2.4.0.0-169/pig/doc/api/constant-values.html:<td><code>"hadoop-site.xml"</code></td>
/usr/hdp/2.4.0.0-169/pig/conf/pig.properties:# By default, Pig expects hadoop configs (hadoop-site.xml and core-site.xml)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-3200: MiniCluster should delete hadoop-site.xml on shutDown (prkommireddi via cheolsoo)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-2491: Pig docs still mention hadoop-site.xml (daijy)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-1791: System property mapred.output.compress, but pig-cluster-hadoop-site.xml doesn't (daijy)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt:PIG-1186: Pig do not take values in "pig-cluster-hadoop-site.xml" (daijy)
/usr/hdp/2.4.0.0-169/pig/CHANGES.txt: 'pig-cluster-hadoop-site.xml' in the non HOD case just like it does in the
/usr/hdp/2.4.0.0-169/pig/RELEASE_NOTES.txt:variable to point to the directory with your hadoop-site.xml file and then run
05-21-2016
10:28 PM
I am running the HDP Sandbox. The hadoop version command returns the output below, and to my recollection I have not changed anything about the environment.
hadoop version
Hadoop 2.7.1.2.4.0.0-169
Subversion git@github.com:hortonworks/hadoop.git -r 26104d8ac833884c8776473823007f176854f2eb
Compiled by jenkins on 2016-02-10T06:18Z
Compiled with protoc 2.5.0
From source with checksum cf48a4c63aaec76a714c1897e2ba8be6
This command was run using /usr/hdp/2.4.0.0-169/hadoop/hadoop-common-2.7.1.2.4.0.0-169.jar
05-21-2016
09:48 PM
Actually, that is not the issue. When I looked further down in the log, I found the error below. I will investigate, but I would appreciate any guidance as well.
ERROR 4010: Cannot find hadoop configurations in classpath (neither hadoop-site.xml nor core-site.xml was found in the classpath).
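ERROR 4010 means Pig could not find the Hadoop *-site.xml files on its classpath. A minimal sketch of the usual remedy, assuming the sandbox default conf directory of /etc/hadoop/conf:

```shell
# Point Pig at the directory containing core-site.xml / hdfs-site.xml
# so its launcher can find the cluster configuration.
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/etc/hadoop/conf}
export PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_CONF_DIR
echo "PIG_CLASSPATH=$PIG_CLASSPATH"
```

This mirrors the pig-env.sh lines shown earlier in the thread; the difference is that an Oozie launcher does not source pig-env.sh, so the conf files must be visible to the launcher's classpath as well.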
05-21-2016
09:29 PM
I took a look at the stderr logs and got the output below. Does this make sense?
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/23/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/filecache/44/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Details at logfile: /hadoop/yarn/local/usercache/mbharrel/appcache/application_1463627394303_0038/container_e16_1463627394303_0038_01_000002/pig-job_1463627394303_0038.log
05-21-2016
04:33 AM
I am trying to run an Oozie job that runs a Pig script. My job.properties file is local and my workflow.xml is in HDFS. Below is my properties file.
nameNode=hdfs://192.168.56.104:8020
jobTracker=192.168.56.104:8050
queueName=default
flattenRoot=Flatten_Tweet2
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/oozie/${flattenRoot}
oozie.action.sharelib.for.pig=pig,hcatalog,hive

And here is my workflow file.

<workflow-app name='pig-wf' xmlns="uri:oozie:workflow:0.3">
<start to='pig-node'/>
<action name='pig-node'>
<pig>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${queueName}</value>
</property>
</configuration>
<script>Flatten_Tweet2.pig</script>
<file>lib/hive-site.xml</file>
</pig>
<ok to="end"/>
<error to="fail"/>
</action>
<kill name="fail">
<message>Pig failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

I keep getting the below error no matter what I try.
Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.PigMain], exit code [2]
Any help would be appreciated. Thanks
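After any fix, the job can be resubmitted with the standard Oozie CLI. A sketch, assuming the Oozie server runs on the same host as the NameNode from job.properties (port 11000 is the Oozie default; substitute your own URL):

```shell
# Resubmit the workflow; -config points at the local job.properties shown above.
OOZIE_URL=${OOZIE_URL:-http://192.168.56.104:11000/oozie}
CMD="oozie job -oozie $OOZIE_URL -config job.properties -run"
# Run only where the oozie client is installed; otherwise just display the command.
if command -v oozie >/dev/null 2>&1; then
  $CMD
else
  echo "$CMD"
fi
```

Remember that workflow.xml must already be uploaded to the HDFS path given by oozie.wf.application.path before submitting.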
Labels:
- Apache Oozie
05-19-2016
03:20 AM
Changing the ulimit appears to have fixed that issue. I am now presented with the issue below, but at least it is something different. Thank you, Xi, for the help.
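For reference, the ulimit change mentioned here can be checked and made persistent roughly as follows; the example limit values are illustrative, not the thread's actual numbers:

```shell
# Show the current per-session limits on open file descriptors;
# the sandbox default of 1024 is often too low for Hadoop daemons.
ulimit -n     # soft limit
ulimit -Hn    # hard limit
# A persistent change goes in /etc/security/limits.conf, e.g.:
#   *  soft  nofile  10000
#   *  hard  nofile  10000
# (a new login session is needed for it to take effect)
```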