Created 01-28-2016 11:44 PM
Are there any recommendations / guidelines / best practices from Hortonworks or any other Hadoop provider on how to set up and maintain a cluster of any size?
Thanks in advance
Created 01-28-2016 11:47 PM
Deploy - Ambari Operations tool
Security - Ranger and Kerberos
Data Governance - Falcon and Atlas
Next, get familiar with http://docs.hortonworks.com, as it has all the resources you need to deploy and maintain the cluster.
Start with this
Created 01-28-2016 11:50 PM
@Greenhorn Techie HCC has really good articles and content: link
Created 01-28-2016 11:57 PM
@Neeraj Sabharwal Thanks for this. I was looking for slightly more technical and specific coverage, primarily from an administrative perspective. For example:
1. How best to split the master services across the various nodes
2. What is the recommended hardware choice when different workloads are present (think of a lambda architecture, etc.)?
3. What kind of disks are best suited, and is there a particular configuration for arranging them (for example, RAID?), when they are dedicated to Kafka brokers rather than shared with DataNodes?
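(As a rough sketch for point 3, not an official Hortonworks recommendation: Kafka brokers are commonly given dedicated JBOD disks, with one data directory per disk listed in the broker's `server.properties`. The mount points below are assumptions for illustration only.)

```properties
# Hypothetical mount points, one dedicated physical disk per directory (JBOD).
# Kafka already spreads partition logs across the entries in log.dirs,
# so RAID 0 striping adds little; RAID 10 trades capacity for the ability
# to survive a disk failure without taking the broker down.
log.dirs=/grid/0/kafka-logs,/grid/1/kafka-logs,/grid/2/kafka-logs
```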
Thanks
Created 01-29-2016 12:02 AM
@Greenhorn Techie You need to read this, and also follow Hortonworks on SlideShare.
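(On point 1 of the question above: Ambari blueprints let you pin master components to separate host groups at deploy time. The fragment below is a minimal, illustrative sketch; the host-group names, component placement, and stack version are assumptions, not a sizing recommendation.)

```json
{
  "Blueprints": { "blueprint_name": "split-masters", "stack_name": "HDP", "stack_version": "2.3" },
  "host_groups": [
    { "name": "master-1", "cardinality": "1",
      "components": [ { "name": "NAMENODE" }, { "name": "ZOOKEEPER_SERVER" } ] },
    { "name": "master-2", "cardinality": "1",
      "components": [ { "name": "RESOURCEMANAGER" }, { "name": "ZOOKEEPER_SERVER" } ] },
    { "name": "workers", "cardinality": "1+",
      "components": [ { "name": "DATANODE" }, { "name": "NODEMANAGER" } ] }
  ]
}
```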
Created 01-29-2016 12:20 AM
Thanks @Neeraj Sabharwal. This is precisely what I was looking for. Best regards
Created 01-29-2016 01:51 AM
@Greenhorn Techie Thanks to @Artem Ervits for bringing this up.
SmartSense is a MUST.
Created 01-29-2016 12:48 AM
@Greenhorn Techie perfect!
Please accept the answer to close the thread. Great question!
Created 01-29-2016 01:45 AM
@Greenhorn Techie and install SmartSense; it will push best practices to you.