Member since
09-28-2015
24
Posts
19
Kudos Received
6
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3495 | 06-13-2016 04:15 PM |
| | 1338 | 06-13-2016 10:54 AM |
| | 1017 | 06-12-2016 01:26 PM |
| | 1563 | 06-12-2016 03:44 AM |
| | 3977 | 02-03-2016 08:55 PM |
09-18-2019
06:55 AM
@myoung Can you please give the syntax for writing this sort of query?
09-16-2016
03:30 AM
Great article!
09-11-2016
07:34 PM
Thanks @Bryan Bende, I think this is a great solution. I'm currently using two attributes to help create directory structures dynamically in HDFS: /tmp/data_staging/${SourceAttribute}/${data.date}
I was planning to get around the lack of multiple-correlation-attribute support by using a RouteOnAttribute processor to route to different MergeContent processors, then back to a single PutHDFS processor using the original attributes. With your suggestion I can build a much cleaner and more flexible solution: after merging the two attributes with an UpdateAttribute processor, I'll send the data to a MergeContent processor, where I'll bin the files on the new combined attribute, and then on to my PutHDFS processor. The merged attribute will persist (e.g., dataSource1_20160911), so I can then do something like the following to continue creating directories dynamically: /tmp/data_staging/${convergedSourceDateAttribute:substringBefore('_')}/${convergedSourceDateAttribute:substringAfter('_')}
Does that seem reasonable?
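The substringBefore/substringAfter split described above can be sketched outside NiFi with plain shell parameter expansion. This is just an illustration of the string logic, not NiFi code; the attribute name and value are the examples from this thread:

```shell
# Hypothetical merged attribute value, e.g. produced by UpdateAttribute
convergedSourceDateAttribute="dataSource1_20160911"

# Shell equivalent of substringBefore('_') and substringAfter('_'):
# %%_* strips everything from the first underscore on; #*_ strips up to it
echo "/tmp/data_staging/${convergedSourceDateAttribute%%_*}/${convergedSourceDateAttribute#*_}"
# prints /tmp/data_staging/dataSource1/20160911
```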
08-10-2016
11:43 PM
7 Kudos
The Apache NiFi community recently released the beta version of Apache NiFi 1.0.0. This version comes with significant updates, including a UI refresh, a transition to zero-master clustering, multi-tenant authorization, and templates that are now deterministically ordered, allowing for version-controlled templates! The beta also boasts nine new processors, bringing the total to 165. The full list of release notes can be found here.
Below I will do a very basic walkthrough of some of the UI changes from 0.7.0 to the 1.0.0 beta:
[Figure: Original Apache NiFi 0.7.0 flow]
[Figure: New Apache NiFi 1.0.0 beta flow]
Outside of the more modernized look and feel, there are some key UI changes:
(1) Apache NiFi 1.0.0 beta now includes a status bar showing statistics for the overall flow, including bytes in and out, the number of started and stopped processors, processors in error, last refresh time, etc.
(2) The beta also has new collapsible navigation and operation panes.
(3) There's a new drop-down menu where you can access information like the flow summary, provenance, and the bulletin board (e.g., error messages).
(4) The search field is much more prominent now and allows users to search through complex flows to quickly find and jump to processors and other elements on the flow.
10-17-2017
11:42 AM
Was anybody ever able to get the Spark jobs to show on HDP 2.6?
06-12-2016
01:26 PM
@Micky Woo The article you're pointing to is demonstrating the Ranger tag-based policy technical preview. While it's not GA in the Apache Ranger 0.5.x line, it is targeted as a development theme for 0.6.x: https://cwiki.apache.org/confluence/display/RANGER/Tag+Based+Policies
09-16-2016
05:44 AM
Hi Manoj,
I visited your blog, and the information you provide there is impressive.
I agree with your point: maximizing the use of solar energy is good for the planet as well as for individuals. I also started a solar revolution to save the planet. More information here: https://powur.com/jose.rosa/learn
Thanks!!
05-07-2016
12:21 PM
@David Lyle Thanks, you're correct that there's a version mismatch, but not with Python. I had issues with easy_install, so I removed both references to it and installed setuptools through pip. Though I cannot comment on the final result, I'm past this issue.
cd /usr/local/bin
rm -rf easy_install*    # remove the stale easy_install scripts
curl https://bootstrap.pypa.io/ez_setup.py -o - | python    # reinstall setuptools
02-06-2016
12:23 AM
2 Kudos
@Henry Sowell Thank you for your help. I still don't know what causes the problem, but I found a solution similar to yours. I downloaded the RPM tarball (using wget) onto the HTTP server that I already had for the other repositories (which were synced with reposync). Then I created a local repository using the createrepo command and adapted the .repo files on all hosts in the cluster. Afterwards, the yum clients accepted the repo and downloaded the necessary packages. http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.0/bk_HDP_RelNotes/content/download-links-230.html
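For illustration, a local yum repo definition like the one described above typically looks something like this. The repo ID, hostname, and baseurl path here are placeholders, not the actual values used:

```
[HDP-2.3.0-local]
name=Local HDP 2.3.0 repository
baseurl=http://your-http-server/hdp/HDP/centos6/2.x/updates/2.3.0.0
enabled=1
gpgcheck=0
```

After placing this file in /etc/yum.repos.d/ on each host, running yum clean all makes the clients pick up the new repo.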