Created on 01-23-2017 06:14 AM - edited 09-16-2022 03:56 AM
Hi All,
To install/run HDP using Ambari, many ports (such as 50070) need to be open.
However, on cloud platforms, keeping these ports open creates risks.
Is there a way to keep them accessible to the services while blocking them from the outside internet?
Thanks,
Avijeet
Created 01-23-2017 06:20 AM
In a cloud environment, you create the cluster within a VPC (AWS) or an Azure Virtual Network, which becomes an extension of your own network. In addition, both of these cloud providers (and the other major ones) offer network ACLs. You are not really opening up ports to a DMZ. Any practical deployment should use these features, regardless of Hadoop.
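For illustration, here is a minimal boto3 sketch of an inbound network ACL rule on AWS that allows the NameNode UI port (50070) only from inside the VPC. The region, ACL ID, and CIDR block are hypothetical placeholders, not values from this thread:

```python
import boto3

# Hypothetical placeholders: use your own region, network ACL ID, and VPC CIDR.
ec2 = boto3.client("ec2", region_name="us-east-1")
ACL_ID = "acl-0123456789abcdef0"
VPC_CIDR = "10.0.0.0/16"

# Inbound rule: allow TCP 50070 (NameNode UI) only from addresses inside the VPC.
# A custom network ACL ends with an implicit deny-all, so traffic from the
# public internet to this port never matches an allow rule.
ec2.create_network_acl_entry(
    NetworkAclId=ACL_ID,
    RuleNumber=100,
    Protocol="6",        # protocol number 6 = TCP
    RuleAction="allow",
    Egress=False,        # False = inbound rule
    CidrBlock=VPC_CIDR,
    PortRange={"From": 50070, "To": 50070},
)
```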
Created 01-23-2017 06:28 AM
Thanks @mqureshi
Can you please confirm: for a cluster deployed without a VPC, is there any way to secure Hadoop with all of these ports open?
I am thinking of Knox as one option. Is there anything else that can be done quickly? Also, will Knox work without LDAP/AD?
Regards,
Avijeet
Created 01-23-2017 06:37 AM
The only thing you can do is limit which IPs can access your cluster, basically by specifying security rules for inbound traffic (and outbound as well).
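As a concrete sketch on AWS, the boto3 snippet below opens the common HDP web UI ports only to a single trusted IP; the security group ID, admin IP, and port list are hypothetical placeholders to adapt to your cluster:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Hypothetical placeholders: substitute your security group ID and trusted IP.
SG_ID = "sg-0123456789abcdef0"
ADMIN_CIDR = "203.0.113.25/32"            # a single trusted admin IP
UI_PORTS = [8080, 50070, 8088, 19888]     # Ambari, NameNode UI, ResourceManager UI, JobHistory

# Inbound rules: each UI port is reachable only from the trusted IP;
# all other inbound traffic is dropped by the security group's default deny.
ec2.authorize_security_group_ingress(
    GroupId=SG_ID,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": port,
            "ToPort": port,
            "IpRanges": [{"CidrIp": ADMIN_CIDR, "Description": "trusted admin IP only"}],
        }
        for port in UI_PORTS
    ],
)
```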