Member since: 09-29-2015
Posts: 56
Kudos Received: 8
Solutions: 6
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2559 | 06-08-2016 02:49 PM
 | 1390 | 04-14-2016 11:14 PM
 | 4469 | 01-20-2016 03:22 AM
 | 2150 | 12-21-2015 09:05 PM
 | 2555 | 12-15-2015 10:55 PM
01-22-2016
08:38 PM
@bsaini @zblanco @Rafael Coss @Balu I was able to run through the tutorial on a machine I built myself with HDP 2.3.4; even though I had something wrong with the paths, it works. Granted, I was using the latest HDP 2.3 tutorial (https://github.com/ZacBlanco/hwx-tutorials/blob/master/2-3/tutorials/define-and-process-data-pipelines-with-falcon/define-and-process-data-pipelines-with-apache-falcon.md), which has no CLI commands for Falcon.
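For anyone who prefers the CLI route the tutorial omits, here is a minimal sketch of how Falcon entities are typically submitted and scheduled from the command line. The entity XML file names and the process name are placeholders, not taken from the tutorial:

```bash
# Submit the cluster, feed, and process entity definitions (file names are placeholders)
falcon entity -submit -type cluster -file primaryCluster.xml
falcon entity -submit -type feed    -file inputFeed.xml
falcon entity -submit -type process -file ingestProcess.xml

# Schedule the process once its cluster and feed dependencies are submitted
falcon entity -schedule -type process -name ingestProcess

# Check the status of the scheduled process
falcon entity -status -type process -name ingestProcess
```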
12-22-2015
09:25 AM
I am also facing the same problem. I have tried the suggestion below: with the proxy disabled, connecting to Oozie with a curl command works fine, but I still get the "invalid Oozie server or port" error. Also, my Hive metastore is not working; I have ignored that so far, but could it in any way be a problem when submitting the Falcon entity?
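For anyone hitting the same "invalid Oozie server or port" error, a quick sanity check is to query the Oozie admin REST endpoint directly and make sure the same URL is what Falcon is configured to use. The hostname and the default port 11000 below are assumptions; adjust them to your cluster:

```bash
# Query Oozie's admin status endpoint; a healthy server returns {"systemMode":"NORMAL"}
curl http://sandbox.hortonworks.com:11000/oozie/v1/admin/status

# The same base URL (including the /oozie context path) should match the
# workflow engine interface defined in the Falcon cluster entity.
```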
11-28-2017
09:10 AM
Hi Team, I have tried the above and I see the job status KILLED after running the workflow. After launching Oozie, I can see the workflow change status from RUNNING to KILLED. Is there a way to troubleshoot this? I can run hadoop fs -ls commands on my S3 bucket, so I definitely have access. I suspect it is the S3 URL. I tried downloading the XML, changing the URL, and re-uploading it, with no luck. Any other suggestions? I appreciate all your help and support in advance. Regards, Anil
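A minimal sketch of how a KILLED workflow is usually diagnosed from the Oozie CLI, assuming the default Oozie URL; the hostname, job ID, and bucket path below are placeholders:

```bash
# Show the overall job info, including which action failed (job ID is a placeholder)
oozie job -oozie http://sandbox.hortonworks.com:11000/oozie -info 0000001-171128000000000-oozie-oozi-W

# Dump the job log, which normally contains the error that caused the kill
oozie job -oozie http://sandbox.hortonworks.com:11000/oozie -log 0000001-171128000000000-oozie-oozi-W

# Verify the bucket is reachable through the same s3a:// URL used in the workflow XML
hadoop fs -ls s3a://my-bucket/path/
```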