
Error after running oozie shell action with oozie user



Hi:

After running this command (the original post had an en-dash before `auth` and omitted the `-oozie` flag; corrected here):

oozie job -oozie http://hdp-oozie01:11000/oozie -config /var/lib/hadoop-hdfs/test/job.properties -auth simple -run

The job.properties and workflow.xml files are:

nameNode=hdfs://hdp-spark-master01:8020
jobTracker=hdp-spark-master01:8032
queueName=default
examplesRoot=examplesoozie
oozie.wf.application.path=hdfs://hdp-spark-master01/user/oozie/examples/test/workflow.xml
outputDir=map-reduce
wf.user=oozie
myscript=myscript.sh
myscriptPath=hdfs://hdp-spark-master01/user/oozie/examples/test/myscript.sh
oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib


<workflow-app name="shell-wf">
    <start to="shell-node"/>
    <action name="shell-node">
        <shell>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <exec>${myscript}</exec>
            <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
            <file>${myscriptPath}</file>
            <capture-output/>
        </shell>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Shell action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <kill name="fail-output">
        <message>Incorrect output, expected [Hello Oozie] but was [${wf:actionData('shell-node')['my_output']}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

The error is:

Guessed logs' owner is oozie and current user root does not have permission to access /app-logs/oozie/logs/application_1504172297013_0001. Error message found: Permission denied: user=root, access=EXECUTE, inode="/app-logs/oozie/logs/application_1504172297013_0001":oozie:hdfs:drwxrwx---


	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1955)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1939)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1913)
	at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getListingInt(FSDirStatAndListingOp.java:77)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:4780)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:1124)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:645)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2351)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2347)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2345)
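The last line of the error message explains the denial: `/app-logs/oozie/logs/application_1504172297013_0001` is owned by `oozie:hdfs` with mode `drwxrwx---`, so the "other" class, which `root` falls into here (HDFS does not grant root any special privileges the way a local filesystem does), has no EXECUTE bit and cannot traverse the directory. A minimal local sketch of the same mode arithmetic (the temp directory is illustrative, not a cluster path):

```shell
#!/bin/sh
# Reproduce the drwxrwx--- (0770) mode from the error on a local directory
# and show which permission classes it grants. Illustration only; HDFS
# applies the same owner/group/other model, but unlike the local kernel
# it does not exempt root from the check.
demo=$(mktemp -d)
chmod 770 "$demo"

mode=$(stat -c '%a' "$demo")      # numeric mode (GNU stat): 770
echo "mode=$mode"                 # owner=rwx(7) group=rwx(7) other=---(0)

# The "other" digit is 0: no read, no write and, crucially for directory
# traversal, no execute bit -- hence access=EXECUTE is denied for user=root.
other=${mode#??}
echo "other=$other"

rm -rf "$demo"
```

One common workaround, assuming your cluster setup matches the error above, is to fetch the logs as a user the mode does permit, e.g. `sudo -u oozie yarn logs -applicationId application_1504172297013_0001`, or to add the reading user to the `hdfs` group.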

Any suggestions?
