Support Questions


CDH5.0 VM: Error getting logs for job_1404657916663_0001


Hello,

I am using the CDH 5.0 QuickStart VM for VirtualBox. I run a MapReduce job and it completes successfully. The reduce code has several System.out.println statements whose output should end up in the job log, but when I open the logs in Hue > Job Browser, I get the error message:

 

Error getting logs for job_1404657916663_0001
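
(Side note: assuming YARN log aggregation is enabled in the VM, which the /tmp/logs path in the exception below suggests, the aggregated container logs, including the System.out.println output, should also be readable from the command line with the yarn CLI; the application ID is the job ID with the job_ prefix replaced by application_:

yarn logs -applicationId application_1404657916663_0001

That would at least confirm that the logs themselves were written.)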

 

When I look at the JobHistory Server log in /var/log/hadoop-mapreduce/hadoop-cmf-yarn-JOBHISTORY-localhost.localdomain.log.out, I find exceptions like the one below:

 

2014-07-06 08:18:20,767 ERROR org.apache.hadoop.yarn.webapp.View: Error getting logs for job_1404659730687_0001
org.apache.hadoop.security.AccessControlException: Permission denied: user=mapred, access=EXECUTE, inode="/tmp/logs/cloudera/logs":cloudera:supergroup:drwxrwx---
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5461)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5443)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:5405)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1680)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1632)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1612)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1586)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:482)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:322)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)
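
(The exception says the mapred user needs EXECUTE access on /tmp/logs/cloudera/logs, which is owned by cloudera:supergroup with mode drwxrwx---. The ownership and group membership involved can be checked with standard commands, as a diagnostic sketch:

hdfs dfs -ls /tmp/logs
hdfs dfs -ls /tmp/logs/cloudera
hdfs groups mapred

If mapred is not a member of supergroup, the group rwx bits on that directory do not help it.)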

 

So, I do:

hdfs dfs -chmod -R 777 /tmp/logs/cloudera
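
The listing below is presumably the output of a follow-up check along the lines of:

hdfs dfs -ls /tmp/logs/cloudera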

 

drwxrwxrwx   - cloudera supergroup          0 2014-07-06 08:23 /tmp/logs/cloudera/logs

 

Then I run the MapReduce job again, and I still get the same error in the Job Browser.

The permissions on the log directory have reverted:

 

drwxrwx---   - cloudera supergroup          0 2014-07-06 08:23 /tmp/logs/cloudera/logs

 

What should be done to get the logs for the job?

 

Thanks in advance.

 

Lohith.
