Created 04-03-2016 01:58 AM
My Pig script failed when I tried to do an ORDER BY. I want to check the logs, but I can't find their location. I learned from a post that it should be set in the file (/etc/hadoop/hadoop-env.sh), but everything in that file is commented out, as shown below. Does that mean the logs are not being written? How can I set those values so I can look into the logs? My script is failing again and again, and I need the logs to troubleshoot it. Also, my hadoop-env.sh seems to be at a different path (/etc/hadoop/conf.install). Is there anything wrong with that?
# On secure datanodes, user to run the datanode as after dropping privileges.
# This **MUST** be uncommented to enable secure HDFS if using privileged ports
# to provide authentication of data transfer protocol. This **MUST NOT** be
# defined if SASL is configured for authentication of data transfer protocol
# using non-privileged ports.
#export HADOOP_SECURE_DN_USER=${HADOOP_SECURE_DN_USER}

# Where log files are stored. $HADOOP_HOME/logs by default.
#export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER
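From the comment in the file itself, logs default to $HADOOP_HOME/logs when HADOOP_LOG_DIR is unset, so the commented lines should not mean logging is disabled. A minimal sketch of what setting it explicitly might look like (the directory below is an assumption; on HDP, hadoop-env.sh is typically managed through Ambari, so hand edits may be overwritten):

# assumption: /var/log/hadoop exists and is writable by the hadoop service users
export HADOOP_LOG_DIR=/var/log/hadoop/$USER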
Created 04-03-2016 04:55 AM
Hello Amit, you mentioned that you "need to analyse map reduce logs in order to resolve the issue."
Did you check the following directories for the MapReduce logs? (There is also a YARN CLI option sketched after the list.)
/var/log/hadoop-mapreduce/mapred/ (for History logs)
/var/log/hadoop-yarn/yarn (for hadoop-mapreduce.jobsummary.log)
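If YARN log aggregation is enabled on the cluster, you can also pull the complete container logs for the failed job with the YARN CLI (the application ID below is a placeholder; use the one printed in your job output):

yarn logs -applicationId application_1459500000000_0001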
Created 04-03-2016 02:24 AM
Your Pig job's logging is set in the pig.properties file. By default, the log for a failed job is saved in the same directory you ran the job from, in a file prefixed with pig_. You can override that in the Ambari configs or by passing arguments when you execute the job. Review the pig.properties section of the Pig wiki, and here's a sample pig.properties file from trunk: https://svn.apache.org/repos/asf/pig/trunk/conf/pig.properties
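For example, a minimal sketch of overriding the client-side log location at execution time (the script name and log path are placeholders):

pig -logfile /tmp/pig_client.log -f myscript.pig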
Created 04-03-2016 03:25 AM
Yes, you are right. It is creating a log in the same directory, but only when the Pig script itself fails. It is not creating a log when the MapReduce job fails. I need to analyse the MapReduce logs in order to resolve the issue.
Created 04-03-2016 11:19 AM
Understood. It was not clear from your question; to me it sounded more like a configuration question.
Created 04-04-2016 06:28 PM
Can you or someone help me with the main question? I am not able to figure out the reason for the job failure. After googling, I think it may be related to a variable not being set. I am adding more details below about how the variables are showing null. Could you please help?
Created 04-04-2016 06:42 PM
Please provide your script.
Created 04-03-2016 06:15 AM
Thanks, this is what I was looking for, although these logs do not give much information. Do you know if there is a way to make the logs more readable, or should I ask another question?
Created 04-03-2016 06:33 AM
If you want to debug the jobs that are failing, you can raise the log level, something like the following:
hadoop jar hadoop-mapreduce-examples.jar TestCount -Dyarn.app.mapreduce.am.log.level=TRACE -Dmapreduce.map.log.level=TRACE -Dmapreduce.reduce.log.level=TRACE
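Since the failing job here is launched by Pig rather than through hadoop jar, the same properties can be passed on the pig command line; note that Pig expects -D properties before its other options (the script name is a placeholder):

pig -Dyarn.app.mapreduce.am.log.level=TRACE -Dmapreduce.map.log.level=TRACE -Dmapreduce.reduce.log.level=TRACE -f myscript.pig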
Created 04-04-2016 06:29 PM
[hdfs@ bin]$ ls -lrt
total 32
-rwxr-xr-x. 1 root root 1857 Dec 15 22:25 rcc
-rwxr-xr-x. 1 root root 6594 Dec 15 22:25 hadoop.distro
-rwxr-xr-x. 1 root root  775 Dec 15 22:26 yarn
-rwxr-xr-x. 1 root root  782 Dec 15 22:26 mapred
-rwxr-xr-x. 1 root root  775 Dec 15 22:26 hdfs
-rwxr-xr-x. 1 root root  807 Dec 15 22:26 hadoop-fuse-dfs
-rwxr-xr-x. 1 root root  772 Apr  4 14:21 hadoop
[hdfs@ bin]$ cat hadoop
#!/bin/bash

# Autodetect JAVA_HOME if not defined
if [ -e /usr/libexec/bigtop-detect-javahome ]; then
  . /usr/libexec/bigtop-detect-javahome
elif [ -e /usr/lib/bigtop-utils/bigtop-detect-javahome ]; then
  . /usr/lib/bigtop-utils/bigtop-detect-javahome
fi

export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.3.4.0-3485/hadoop}
export HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-mapreduce}
export HADOOP_YARN_HOME=${HADOOP_YARN_HOME:-/usr/hdp/2.3.4.0-3485/hadoop-yarn}
export HADOOP_LIBEXEC_DIR=${HADOOP_HOME}/libexec
export HDP_VERSION=${HDP_VERSION:-2.3.4.0-3485}
export HADOOP_OPTS="${HADOOP_OPTS} -Dhdp.version=${HDP_VERSION}"
export YARN_OPTS="${YARN_OPTS} -Dhdp.version=${HDP_VERSION}"

exec /usr/hdp/2.3.4.0-3485//hadoop/bin/hadoop.distro "$@"

[hdfs@ip- bin]$ echo $HADOOP_HOME

[hdfs@ip- bin]$ pwd
/usr/hdp/current/hadoop-client/bin
[hdfs@ip- bin]$
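For what it's worth, the wrapper script above only exports HADOOP_HOME for the hadoop.distro process it execs, so an empty echo in the interactive shell is expected. If a tool genuinely needs it in your own shell session, a minimal sketch using the paths already visible above:

# assumption: the 'current' symlink points at the active HDP version
export HADOOP_HOME=/usr/hdp/current/hadoop-client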