Is the Hortonworks Community Connection site using Hadoop for processing data?
- Labels: Apache Hadoop
Created ‎12-15-2015 08:40 AM
Community Connection does a fantastic job of providing solutions to users' problems. My question is: are the developers using Hadoop or related technologies to build this site? If yes, which ones?
Created ‎12-15-2015 04:45 PM
As @Jonas Straub states, we are using AnswerHub for our functionality. The backend database is MySQL; here is their stack: http://docs.answerhub.com/articles/1073/answerhub-architecture.html
Created ‎12-15-2015 09:15 AM
At the moment the Hortonworks Community is using AnswerHub as its platform. I believe they use Django as the backend framework, though I'm not sure. As far as I know, we have not connected any HDP tool; @Mark Herring has all the details.
We could definitely use some of the tools from our HDP stack to power different services of our community site. Are you working on a similar use case?
Created ‎12-15-2015 12:42 PM
I am not working on a use case similar to the community site; I work on Hadoop only. I was just thinking about how we could connect our HDP tools to power communities like this one.
Created ‎12-15-2015 01:10 PM
Here are some use cases for a community like this:
- Solr to power the search engine of the community
- Hive to analyze access logs and user interaction
- Kafka/Storm to prevent spam (kind of like FraudDetection for banks) and process posts (filter bad words, escape posts, etc.)
I also like the use case from Facebook, they are using HBase for their messages, chats, etc. (https://www.facebook.com/notes/facebook-engineering/the-underlying-technology-of-messages/4549916089...)
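To make the Kafka/Storm idea above more concrete, here is a minimal sketch in plain Python of the post-processing step (escaping posts and filtering bad words). The `BAD_WORDS` set and `sanitize_post` function are hypothetical names for illustration only; in a real deployment this logic would live inside a Storm bolt consuming posts from a Kafka topic.

```python
import html
import re

# Hypothetical blocklist; a real system would load this from configuration.
BAD_WORDS = {"spamword", "badword"}

def sanitize_post(text: str) -> str:
    """Escape HTML and mask blocklisted words before a post is published."""
    # Escape posts so any embedded markup is rendered inertly.
    escaped = html.escape(text)

    def mask(match: re.Match) -> str:
        word = match.group(0)
        # Replace each blocklisted token with asterisks of the same length.
        return "*" * len(word) if word.lower() in BAD_WORDS else word

    return re.sub(r"\b\w+\b", mask, escaped)
```

For example, `sanitize_post("hello spamword")` returns `"hello ********"`, and a post containing `<script>` comes back with the angle brackets escaped. Spam *detection* (the fraud-detection analogy) would need a statistical model on top; this sketch only covers the deterministic filtering half.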
Created ‎12-15-2015 05:38 PM
Wow, cool! Thanks for sharing the architecture pic 🙂
