Member since: 12-11-2015
Posts: 67
Kudos Received: 10
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2531 | 05-11-2016 04:36 PM |
| | 3199 | 01-28-2016 10:42 PM |
01-05-2016
03:36 PM
I did as you said. Below is the changed workflow.

<workflow-app xmlns="uri:oozie:workflow:0.3" name="pdr-distcp-wf">
<start to="distcp-node"/>
<action name="distcp-node">
<distcp xmlns="uri:oozie:distcp-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode1}</name-node>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
<configuration>
<property>
<name>oozie.launcher.mapreduce.job.hdfs-servers</name>
<value>${nameNode1},${nameNode2}</value>
</property>
</configuration>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
</distcp>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

I still see the error below:

Error: Invalid app definition, org.xml.sax.SAXParseException; lineNumber: 9; columnNumber: 19; cvc-complex-type.2.4.a: Invalid content was found starting with element 'configuration'. One of '{"uri:oozie:distcp-action:0.1":arg}' is expected.

Please advise.
Thanks, Venkat
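For context on this validator error: the distcp-action XSD fixes the order of child elements — job-tracker, name-node, optional prepare, optional configuration, optional java-opts, then the arg elements. Once an arg has appeared, only further arg elements are allowed, which is why a configuration block placed after the args is rejected. A sketch of the corrected action body (using the same parameter names as the workflow above, with each arg appearing only once):

```xml
<action name="distcp-node">
<distcp xmlns="uri:oozie:distcp-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode1}</name-node>
<!-- <configuration> must come BEFORE any <arg> elements -->
<configuration>
<property>
<name>oozie.launcher.mapreduce.job.hdfs-servers</name>
<value>${nameNode1},${nameNode2}</value>
</property>
</configuration>
<!-- each <arg> appears exactly once, after the configuration block -->
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
</distcp>
<ok to="end"/>
<error to="kill"/>
</action>
```

Note the workflow above also lists the two arg elements twice (before and after configuration); removing the duplicates and moving configuration up should satisfy the schema.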
12-29-2015
07:10 PM
I changed the workflow as you suggested.

<workflow-app xmlns="uri:oozie:workflow:0.3" name="pdr-distcp-wf">
<start to="distcp-node"/>
<action name="distcp-node">
<distcp xmlns="uri:oozie:distcp-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode1}</name-node>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
<configuration>
<property>
<name>oozie.launcher.mapreduce.job.hdfs-servers</name>
<value>${nameNode1},${nameNode2}</value>
</property>
</configuration>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
</distcp>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

But I still get the same error:

Error: Invalid app definition, org.xml.sax.SAXParseException; lineNumber: 9; columnNumber: 19; cvc-complex-type.2.4.a: Invalid content was found starting with element 'configuration'. One of '{"uri:oozie:distcp-action:0.1":arg}' is expected.
12-29-2015
03:54 PM
Okay... Below is my workflow, where I have the distcp block closed as well:

<distcp xmlns="uri:oozie:distcp-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode1}</name-node>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
<configuration>
<property>
<name>oozie.launcher.mapreduce.job.hdfs-servers</name>
<value>${nameNode1},${nameNode2}</value>
</property>
</configuration>
</distcp>

I still get the same error:

Error: Invalid app definition, org.xml.sax.SAXParseException; lineNumber: 9; columnNumber: 19; cvc-complex-type.2.4.a: Invalid content was found starting with element 'configuration'. One of '{"uri:oozie:distcp-action:0.2":arg}' is expected.
12-24-2015
06:14 PM
I closed it. You can see below that it ends with </configuration>.
12-18-2015
04:47 PM
2 Kudos
Hi, I am trying to submit an Oozie workflow with a distcp action, but I get the error below when I validate the workflow:

oozie validate pdr-distcp-wf.xml
Error: Invalid app definition, org.xml.sax.SAXParseException; lineNumber: 9; columnNumber: 20; cvc-complex-type.2.4.a: Invalid content was found starting with element 'configuration'. One of '{"uri:oozie:distcp-action:0.2":arg}' is expected.

Please find the workflow that I am using below:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="pdr-distcp-wf">
<start to="distcp-node"/>
<action name="distcp-node">
<distcp xmlns="uri:oozie:distcp-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode1}</name-node>
<arg>${SourceDir}</arg>
<arg>${TargetDir}</arg>
<configuration>
<property>
<name>oozie.launcher.mapreduce.job.hdfs-servers</name>
<value>${nameNode1},${nameNode2}</value>
</property>
</configuration>
</distcp>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>

Please help me out.
Labels: Apache Oozie
12-18-2015
03:58 PM
I think the problem is that, although I gave 777 over /apps/falcon/backupCluster/staging, when I create a directory under /apps/falcon/backupCluster/staging/falcon/workflows/feed as the falcon user, it is created with 755 permissions, as shown below:

[falcon@hostname ~]$ hdfs dfs -mkdir /apps/falcon/backupCluster/staging/falcon/workflows/feed/test1
[falcon@hostname ~]$ hdfs dfs -ls /apps/falcon/backupCluster/staging/falcon/workflows/feed
drwxr-xr-x - falcon hdfs 0 2015-12-18 09:55 /apps/falcon/backupCluster/staging/falcon/workflows/feed/test1

The new directory is not created with 777 permissions (umask 022). Do you think that could be the reason?
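On the 755 observation: a newly created directory gets mode 777 & ~umask, and HDFS applies the client-side umask (fs.permissions.umask-mode, default 022) the same way the local filesystem does — new directories do not inherit the parent's 777 (only HDFS default ACLs are inherited). A minimal local-filesystem sketch of the same arithmetic (the /tmp path is just for illustration):

```shell
# Illustrative only: local mkdir follows the same umask arithmetic that the
# HDFS client applies (fs.permissions.umask-mode, default 022).
umask 022
mkdir -p /tmp/staging_demo/newdir
stat -c '%a' /tmp/staging_demo/newdir    # prints 755, i.e. 777 & ~022
```

So giving 777 to the existing tree does not guarantee that directories created later are world-writable.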
12-18-2015
03:40 PM
Hey Balu, thanks for the quick responses on this issue; I really appreciate it. I gave recursive 777 permissions to the entire directory:

drwxrwxrwx - falcon hdfs 0 2015-12-10 15:42 /apps/falcon/backupCluster/staging

I still see the error below.
12-16-2015
06:17 PM
T//entities/schedule/feed/hdfspdrrep] ~ Unable to schedule workflow (AbstractSchedulableEntityManager:66)
org.apache.falcon.FalconException: Error preparing base staging dirs: /apps/falcon/backupCluster/staging/falcon/workflows/feed/hdfspdrrep
at org.apache.falcon.workflow.engine.OozieWorkflowEngine.prepareEntityBuildPath(OozieWorkflowEngine.java:185)
at org.apache.falcon.workflow.engine.OozieWorkflowEngine.schedule(OozieWorkflowEngine.java:153)
at org.apache.falcon.resource.AbstractSchedulableEntityManager.scheduleInternal(AbstractSchedulableEntityManager.java:76)
at org.apache.falcon.resource.AbstractSchedulableEntityManager.schedule(AbstractSchedulableEntityManager.java:63)
at org.apache.falcon.resource.SchedulableEntityManager.schedule(SchedulableEntityManager.java:116)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=hdpexeusr, access=WRITE, inode="/apps/falcon/backupCluster/staging/falcon/workflows/feed":falcon:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:179)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6795)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6777)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6729)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4495)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4465)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4438)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:830)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:614)
... 61 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hdpexeusr, access=WRITE, inode="/apps/falcon/backupCluster/staging/falcon/workflows/feed":falcon:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:179)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6795)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6777)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6729)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4495)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4465)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4438)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:830)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:614)
...
2015-12-11 14:22:12,668 ERROR - [1088325169@qtp-1225320031-7009:717f3588-92ea-412b-a666-698ac323b961 hdpexeusr:POST//entities/schedule/feed/hdfspdrrep] ~ Action failed: Bad Request
Error: Error preparing base staging dirs: /apps/falcon/backupCluster/staging/falcon/workflows/feed/hdfspdrrep (FalconWebException:68)
2015-12-11 14:22:12,669 INFO - [1088325169@qtp-1225320031-7009:717f3588-92ea-412b-a666-698ac323b961 hdpexeusr:POST//entities/schedule/feed/hdfspdrrep] ~ {Action:schedule, Dimensions:{entityType=feed, colo=*, entityName=hdfspdrrep}, Status: FAILED, Time-taken:232141743 ns} (METRIC:38)
12-14-2015
03:42 PM
Thank you for your quick response. I still see that error even after giving 777 permissions to all directories within the staging directory for both primaryCluster and backupCluster.
12-11-2015
09:36 PM
Hi, we are using Falcon 0.6. I am trying to submit a Falcon feed schedule as user1, but it throws the error below:

Permission denied: user=user1, access=WRITE, inode="/apps/falcon/backupCluster/staging/falcon/workflows/feed":falcon:hdfs:drwxr-xr-x

I gave 777 permissions to the staging directory but still see the same error. I am able to submit the schedule as the falcon user but not as any other user. Please help me with this.
Thanks, Venkat
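One thing worth ruling out: the denied inode is several levels below the staging root, so a non-recursive chmod on the top-level staging directory would not help. A remediation sketch using the paths from the error above (assumes an HDFS client and a user with rights over the tree, e.g. the hdfs superuser):

```shell
# Sketch only: open write access through the whole staging tree so that
# non-falcon users (e.g. user1) can create feed workflow subdirectories,
# then verify the specific directory from the error message.
hdfs dfs -chmod -R 777 /apps/falcon/backupCluster/staging
hdfs dfs -ls -d /apps/falcon/backupCluster/staging/falcon/workflows/feed
```

If directories created later still come out 755, that points at the client umask rather than the existing permissions.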
Labels: Apache Falcon