Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2636 | 11-01-2016 05:43 PM |
| | 8821 | 11-01-2016 05:36 PM |
| | 4958 | 07-01-2016 03:20 PM |
| | 8299 | 05-25-2016 11:36 AM |
| | 4461 | 05-24-2016 05:27 PM |
01-27-2016
04:13 PM
@Raja Sekhar Chintalapati You have full control through Ambari. I highly recommend installing HS2 using Ambari.
01-27-2016
02:29 PM
@sangeeta rawat https://spark.apache.org/docs/1.0.2/sql-programming-guide.html
01-27-2016
02:24 PM
1 Kudo
@sangeeta rawat It's not supported yet. You can leverage Spark SQL to access Hive.
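To illustrate that last point, here is a minimal sketch of reading a Hive table through Spark SQL, assuming Spark 1.3+ built with Hive support (as shipped with HDP) and a placeholder table name default.my_table:

```scala
// Minimal sketch, assuming Spark 1.3+ with Hive support and hive-site.xml on the classpath.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveViaSparkSQL {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HiveViaSparkSQL"))

    // HiveContext reads Hive metastore tables and runs HiveQL through Spark SQL.
    val hiveContext = new HiveContext(sc)

    // "default.my_table" is a placeholder; replace it with an existing Hive table.
    val df = hiveContext.sql("SELECT * FROM default.my_table LIMIT 10")
    df.show()

    sc.stop()
  }
}
```

In spark-shell the SparkContext is already available as sc, so only the HiveContext creation and the query are needed.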
01-27-2016
01:23 PM
@Dhanooj kolathuparmabil I suggest using Ambari to add Falcon to the stack.
01-27-2016
01:07 PM
@Dhanooj kolathuparmabil
See this tutorial: http://hortonworks.com/hadoop-tutorial/defining-processing-data-end-end-data-pipeline-apache-falcon/ For a manual install, see http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4/bk_installing_manually_book/content/ch_installing_falcon_chapter.html
01-27-2016
12:56 PM
@suresh kumar You can download the document. See this. If you are looking for a technical doc, I'm afraid I don't have a template or a generic one, as it's always customer-driven based on the engagement.
01-27-2016
12:55 PM
@Özgür Akdemirci They look good to me. Also, we can add client tools on any node after the install if we have to.
01-27-2016
12:49 PM
@suresh kumar You have to build/implement your own standards, as with any software stack: you have a software install, data being stored, and users accessing it. HDP is a platform, and it comes with security solutions you can leverage to meet some of the security requirements; the rest you have to build yourself or rely on third-party solutions. See this
01-27-2016
12:46 PM
@Özgür Akdemirci You can install clients on the Ambari server and on one of the data nodes. RegionServers go on data nodes (not master nodes), and NodeManagers on data nodes.
01-27-2016
12:31 PM
1 Kudo
@suresh kumar
HDP is a platform, and you have to build/implement your own compliance standards around it.

- Ranger for authorization, auditing, and a centralized admin console to manage policies
- Kerberos is a MUST for authentication
- Data encryption at rest: TDE or your preferred vendor

You have to implement your own scripts to fulfill the following requirements:

- Password expiration every xx days, including service accounts
- Auditing and more auditing: anything that touches any part of the stack needs to be audited (Ranger and the HDFS audit log are helpful)
- Password complexity
- Failed login attempts
- Data encryption in motion
- Data retention: data must expire after a specific time, otherwise you would have to retain it longer (Falcon can help)

You can read this: http://hortonworks.com/blog/hadoop-security-enterprise/