Member since
12-23-2016
9
Posts
6
Kudos Received
0
Solutions
01-03-2017
05:48 PM
@Vivek Sharma If this answer helps, please accept it. Otherwise, I'd be happy to answer any remaining questions you have.
Thanks! _Tom
01-03-2017
06:20 PM
Thanks @Dominika Bialek and @Tom McCuch. By "AWS public cloud," I mean EC2-Classic.
01-03-2017
05:37 PM
1 Kudo
Hi @Vivek Sharma. When you are creating a cluster, the "Instance Role" parameter allows you to configure S3 access. By default, a new S3 role is created to grant you access to S3 data in your AWS account. See "Instance Role" in Step 7 at http://docs.hortonworks.com/HDPDocuments/HDCloudAWS/HDCloudAWS-1.8.0/bk_hdcloud-aws/content/create/index.html. In addition, there are ways to authenticate with S3 using keys or tokens: http://docs.hortonworks.com/HDPDocuments/HDCloudAWS/HDCloudAWS-1.8.0/bk_hdcloud-aws/content/s3-security/index.html. cc @Ram Venkatesh
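For reference, the key/token authentication described in the second link comes down to the S3A connector's configuration properties. A minimal core-site.xml sketch is below (placeholder values; property names are the standard Apache Hadoop S3A ones, so double-check the HDCloud doc for your exact version):

```xml
<!-- Sketch: static access/secret keys for s3a:// paths -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
<!-- For temporary STS credentials, also supply a session token
     and switch the credentials provider accordingly -->
<property>
  <name>fs.s3a.session.token</name>
  <value>YOUR_SESSION_TOKEN</value>
</property>
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider</value>
</property>
```

With an instance role in place (the default on HDCloud), none of this is needed; the EC2 instance profile supplies credentials automatically.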
01-03-2017
05:46 PM
@Vivek Sharma If this answer helps, please accept it. Otherwise, I'd be happy to answer any remaining questions you have.
Thanks! _Tom
12-26-2016
02:13 PM
2 Kudos
HDC
1. HDC is backed by S3; that's the primary way it offers HA. There are no options to deploy an HA NameNode, RS, etc., as clusters are meant to be terminated once used.
2. Plans for enterprise support will be announced in Q1.

Cloudbreak
1. Yes, you can set up alerts to trigger upscaling or downscaling based on utilization.
2. Yes, you can deploy HA across all HA-compatible components.
3. I'd say Cloudbreak makes things easier overall, though one could argue you lose some control versus a manual install. There may also be a short learning curve, but everything is relative: once you learn Cloudbreak, you have the choice to deploy identical infrastructure on another cloud provider. It's all about choice. I recommend you reach out to a local Hortonworks representative who can pull in the right team of experts to work through your pain points and advise on a solution.
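The utilization-based alerts in Cloudbreak point 1 are essentially threshold rules on a cluster metric. A minimal sketch of that decision logic (hypothetical function and threshold values for illustration only; Cloudbreak evaluates Ambari-metric alerts itself rather than running code like this):

```python
def scaling_action(utilization, scale_out_at=0.8, scale_in_at=0.3):
    """Return a scaling decision for a utilization metric in [0, 1].

    Illustrative only: thresholds and the dead band between them
    (to avoid flapping) mirror what a Cloudbreak alert policy does.
    """
    if utilization >= scale_out_at:
        return "upscale"      # cluster is hot: add nodes
    if utilization <= scale_in_at:
        return "downscale"    # cluster is idle: remove nodes
    return "hold"             # within the dead band: do nothing
```

In Cloudbreak you express the same idea declaratively by attaching an alert (e.g. on memory or CPU utilization) to a scaling policy, rather than writing the rule yourself.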