Member since: 09-29-2015
Posts: 67
Kudos Received: 115
Solutions: 7
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1210 | 01-05-2016 05:03 PM |
 | 1850 | 12-31-2015 07:02 PM |
 | 1710 | 11-04-2015 03:38 PM |
 | 2139 | 10-19-2015 01:42 AM |
 | 1165 | 10-15-2015 02:22 PM |
01-13-2016
07:11 PM
That worked!
05-24-2016
11:21 PM
I ended up storing the file in HDFS and reading it with sc.textFile(args[0]).
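For reference, a minimal sketch of that approach, assuming the Spark 1.x Java API and that args[0] is an HDFS path (the class name and example path are illustrative, not from the original thread):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: reads a text file from HDFS via the path passed as args[0],
// e.g. hdfs://namenode:8020/user/me/input.txt (illustrative path).
public class ReadFromHdfs {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("ReadFromHdfs");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // sc.textFile returns one RDD element per line of the file
        JavaRDD<String> lines = sc.textFile(args[0]);
        System.out.println("Line count: " + lines.count());

        sc.stop();
    }
}
```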
01-08-2016
03:41 PM
More examples are included in the original answer, showing a more efficient method. Aggregation per column is probably really a different question. Feel free to ask it if you still need to!
01-05-2016
02:21 AM
Please accept the answer to close the thread.
02-02-2016
03:00 PM
4 Kudos
I created a simple archetype to solve 1-4. It can be found here: https://github.com/jjmeyer0/spark-java-archetype.git
01-01-2016
02:09 AM
Good call, Vladimir. The job fails with:
mkdir: Permission denied: user=yarn, access=WRITE, inode="/user/ambari-qa/falcon/demo/primary/input/enron/2015-12-30-01":ambari-qa:hdfs:drwxr-xr-x
I executed the job from Falcon as ambari-qa. Is there any configuration I can change so it uses the user ambari-qa during execution?
02-17-2016
12:02 PM
1 Kudo
Thanks!!!
12-28-2015
12:57 AM
Andrew is correct about the difference between ${uuid} and ${UUID()}, but the reason ${uuid} doesn't work is that the GetHTTP processor doesn't evaluate the expression language against the new FlowFile.
As you can see on this line: https://github.com/apache/nifi/blob/528dab78d6dbc3bb4f61663f423b07284936ec40/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/GetHTTP.java#L476
it calls evaluateAttributeExpressions() with no arguments, so the uuid of the new FlowFile is outside the scope of the expression language when it is evaluated. If you want to re-use the FlowFile's UUID, you can use an UpdateAttribute processor to append the UUID to "./downloads/".
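To make the scoping difference concrete, here is a hedged sketch (not the actual GetHTTP source); FILENAME is a hypothetical PropertyDescriptor that supports the expression language:

```java
import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.util.StandardValidators;

// Sketch only: contrasts the two evaluateAttributeExpressions() overloads.
class ExpressionScopeSketch {

    // Hypothetical property whose value might contain ${uuid}
    static final PropertyDescriptor FILENAME = new PropertyDescriptor.Builder()
            .name("Filename")
            .expressionLanguageSupported(true)
            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
            .build();

    void illustrate(ProcessContext context, ProcessSession session) {
        FlowFile flowFile = session.create();

        // No FlowFile in scope: ${uuid} has no attribute map to resolve
        // against, so it evaluates to empty. This is what GetHTTP does.
        String withoutFlowFile = context.getProperty(FILENAME)
                .evaluateAttributeExpressions()
                .getValue();

        // Evaluated against the FlowFile: ${uuid} resolves to its uuid attribute.
        String withFlowFile = context.getProperty(FILENAME)
                .evaluateAttributeExpressions(flowFile)
                .getValue();
    }
}
```

The overload that takes a FlowFile is what puts attributes such as uuid in scope for the expression language.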
11-18-2015
11:59 PM
You should ALWAYS depend on the HDP versions of the artifacts from the HDP Maven repositories instead of the Apache versions. Dependencies resolved from those Maven artifacts will be correctly resolved to the exact same versions. Remember that the base versions of the HDP client and server jars are only indicative of which commits are in those binaries; the HDP versions of the client jars might contain other fixes that are not available in that particular Apache base version. See item (6) in http://community.hortonworks.com/articles/4091/hbase-client-application-best-practices.html
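As a hedged illustration of what that looks like in a pom.xml (the repository URL is the Hortonworks releases repository; the HDP-suffixed version number is illustrative and should match your cluster's HDP version):

```xml
<!-- Sketch: depend on the HDP build of the artifact, not the plain Apache one. -->
<repositories>
  <repository>
    <id>hortonworks-releases</id>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <!-- Apache base version 2.7.1 plus an HDP build suffix (illustrative) -->
    <version>2.7.1.2.3.4.0-3485</version>
  </dependency>
</dependencies>
```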