My Pig script fails when it runs an ORDER BY. I want to check the logs, but I can't find where they are written. From another post I learned that the log location should be set in /etc/hadoop/hadoop-env.sh, but in my copy of that file everything is commented out, as shown below. Does that mean the logs are not being written at all? How can I set those values so I can read the logs? The script keeps failing and I need the logs to troubleshoot it.

Also, my hadoop-env.sh seems to be at a different path (/etc/hadoop/conf.install). Is there anything wrong with that?
# On secure datanodes, user to run the datanode as after dropping privileges.
# This **MUST** be uncommented to enable secure HDFS if using privileged ports
# to provide authentication of data transfer protocol. This **MUST NOT** be
# defined if SASL is configured for authentication of data transfer protocol
# using non-privileged ports.
#export HADOOP_SECURE_DN_USER=${HADOOP_SECURE_DN_USER}
# Where log files are stored. $HADOOP_HOME/logs by default.
#export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER
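If I understand the comment correctly, leaving the line commented just means the default ($HADOOP_HOME/logs) is used. Is this the right way to point logs somewhere explicit? (The /var/log/hadoop path below is only my guess, not an official default.)

```shell
# In hadoop-env.sh: uncomment and set an explicit log directory.
# /var/log/hadoop is a path I chose for illustration; any writable
# directory should work, with a per-user subdirectory as in the
# original commented-out line.
export HADOOP_LOG_DIR=/var/log/hadoop/$USER
```

And after changing it, do I need to restart the Hadoop daemons for the new log directory to take effect?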