Support Questions

Find answers, ask questions, and share your expertise

location of hadoop and pig logs

avatar
Rising Star

My Pig script fails when I try to do an ORDER BY. I want to check the logs, but I can't find their location. I learned from a post that it should be set in the file /etc/hadoop/hadoop-env.sh, but in that file everything is commented out, as shown below. Does that mean the logs are not being written? How can I set those values so I can look at the logs? My script keeps failing and I want to check the logs to troubleshoot it. Also, my hadoop-env.sh seems to be at a different path (/etc/hadoop/conf.install); is anything wrong with that?

# On secure datanodes, user to run the datanode as after dropping privileges.
# This **MUST** be uncommented to enable secure HDFS if using privileged ports
# to provide authentication of data transfer protocol. This **MUST NOT** be
# defined if SASL is configured for authentication of data transfer protocol
# using non-privileged ports.
#export HADOOP_SECURE_DN_USER=${HADOOP_SECURE_DN_USER}

# Where log files are stored. $HADOOP_HOME/logs by default.
#export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER
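For what it's worth, the commented lines only document the defaults: with HADOOP_LOG_DIR unset, Hadoop falls back to $HADOOP_HOME/logs, as the comment itself says. A minimal sketch of pinning it explicitly in hadoop-env.sh (the path below is an example, not a default):

```shell
# hadoop-env.sh: set an explicit log directory (example path, not a default).
# The per-user suffix mirrors the commented-out template above.
export HADOOP_LOG_DIR=/var/log/hadoop/$USER
```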

1 ACCEPTED SOLUTION

avatar

Hello Amit, as you mentioned, you "need to analyse map reduce logs in order to resolve the issue."

So did you check the following directories for the MapReduce logs?

/var/log/hadoop-mapreduce/mapred/ (for History logs)

/var/log/hadoop-yarn/yarn (for hadoop-mapreduce.jobsummary.log)
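A quick way to check those locations (the paths are the assumed defaults from this answer; adjust if your cluster logs elsewhere) would be something like:

```shell
# Assumed default locations from the answer above; override via env if needed.
HIST_DIR=${HIST_DIR:-/var/log/hadoop-mapreduce/mapred}
YARN_DIR=${YARN_DIR:-/var/log/hadoop-yarn/yarn}

# Most recently written history files first
ls -t "$HIST_DIR" 2>/dev/null | head -5 || true

# Look for failures in the job summary log
grep -i 'FAILED' "$YARN_DIR/hadoop-mapreduce.jobsummary.log" 2>/dev/null || true
```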


10 REPLIES

avatar
Master Mentor

Pig job logging is configured in the pig.properties file. By default, the log of a failed job is saved in the directory you ran the job from, in a file prefixed with pig_. You can override that in the Ambari configs or by passing arguments when you execute the job. Review the pig.properties section of the Pig wiki, and here's a sample pig.properties file from trunk: https://svn.apache.org/repos/asf/pig/trunk/conf/pig.properties

https://pig.apache.org/docs/r0.15.0/start.html#properties
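As a hedged example, the client-side log location can also be fixed in pig.properties rather than defaulting to the current working directory (the pig.logfile key appears in the sample file linked above; the path here is just an illustration):

```
# pig.properties: write the client-side error log to a fixed path
# instead of a pig_*.log file in the current working directory.
pig.logfile=/var/log/pig/
```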

avatar
Rising Star

Yes, you are right. It creates a log in the same directory, but only when the Pig script itself fails. It does not create a log when the MapReduce job fails. I need to analyse the MapReduce logs in order to resolve the issue.

avatar
Master Mentor

Understood. That was not clear from your question; to me it sounded more like a configuration question.

avatar
Rising Star

Can you or someone help me with the main question? I am not able to figure out the reason for the job failure. After googling, I think it may be related to a variable not being set. I am adding more details about how the variables are showing null. Could you please help?

avatar
Master Mentor

Please provide your script.

avatar

Hello Amit, as you mentioned, you "need to analyse map reduce logs in order to resolve the issue."

So did you check the following directories for the MapReduce logs?

/var/log/hadoop-mapreduce/mapred/ (for History logs)

/var/log/hadoop-yarn/yarn (for hadoop-mapreduce.jobsummary.log)

avatar
Rising Star

Thanks, this is what I was looking for, although these logs do not give much information. Do you know if there is a way to make the logs more readable? Or perhaps I'll have to ask another question.

avatar

If you want to debug the jobs that are failing, then you can raise the log levels to something like the following:

hadoop jar hadoop-mapreduce-examples.jar TestCount -Dyarn.app.mapreduce.am.log.level=TRACE -Dmapreduce.map.log.level=TRACE -Dmapreduce.reduce.log.level=TRACE
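Once the job has run with those levels, the aggregated container logs can be pulled with the yarn logs command (the application id below is a placeholder; substitute the one printed in your failed job's output, and note this requires log aggregation to be enabled on the cluster):

```shell
# Placeholder application id; substitute the real one from the job output.
APP_ID=application_1460000000000_0001

# Fetch all aggregated container logs for the application.
CMD="yarn logs -applicationId $APP_ID"
echo "$CMD"
```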

avatar
Rising Star
[hdfs@ bin]$ ls -lrt
total 32
-rwxr-xr-x. 1 root root 1857 Dec 15 22:25 rcc
-rwxr-xr-x. 1 root root 6594 Dec 15 22:25 hadoop.distro 
-rwxr-xr-x. 1 root root  775 Dec 15 22:26 yarn 
-rwxr-xr-x. 1 root root  782 Dec 15 22:26 mapred 
-rwxr-xr-x. 1 root root  775 Dec 15 22:26 hdfs 
-rwxr-xr-x. 1 root root  807 Dec 15 22:26 hadoop-fuse-dfs
-rwxr-xr-x. 1 root root  772 Apr  4 14:21 hadoop
[hdfs@ bin]$ cat hadoop
#!/bin/bash
# Autodetect JAVA_HOME if not defined
if [ -e /usr/libexec/bigtop-detect-javahome ]; then
  . /usr/libexec/bigtop-detect-javahome
elif [ -e /usr/lib/bigtop-utils/bigtop-detect-javahome ]; then
  . /usr/lib/bigtop-utils/bigtop-detect-javahome
fi
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.3.4.0-3485/hadoop}
export HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-mapreduce}
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-yarn}
export HADOOP_LIBEXEC_DIR=${HADOOP_HOME}/libexec
export HDP_VERSION=${HDP_VERSION:-2.3.4.0-3485}
export HADOOP_OPTS="${HADOOP_OPTS} -Dhdp.version=${HDP_VERSION}"
export YARN_OPTS="${YARN_OPTS} -Dhdp.version=${HDP_VERSION}"
exec /usr/hdp/2.3.4.0-3485//hadoop/bin/hadoop.distro "$@"
[hdfs@ip- bin]$ echo $HADOOP_HOME
[hdfs@ip-bin]$ pwd
/usr/hdp/current/hadoop-client/bin
[hdfs@ip-bin]$
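That empty echo is expected: HADOOP_HOME is exported inside the wrapper script, so it is visible to the exec'd hadoop.distro process but never to your interactive shell. A small demo of the same effect (the wrapper path and variable name here are made up for illustration):

```shell
# A variable exported inside a script is inherited by that script's
# children, not by the shell that invoked the script.
cat > /tmp/demo_wrapper.sh <<'EOF'
#!/bin/bash
export DEMO_HOME=/opt/demo
echo "inside wrapper: DEMO_HOME=$DEMO_HOME"
EOF
chmod +x /tmp/demo_wrapper.sh

/tmp/demo_wrapper.sh                          # prints the value
echo "calling shell: DEMO_HOME=$DEMO_HOME"    # prints nothing after the colon
```

So if you need HADOOP_HOME set in your own shell session, export it there yourself rather than relying on the wrapper script.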