
ports required to be open

Super Collaborator

Hi All,

To install and run HDP using Ambari, many ports (such as 50070, the HDFS NameNode web UI) need to be open.

However, on cloud platforms keeping these ports open creates security risks.

Is there a way to keep them accessible to the cluster services but blocked from the outside internet?
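For illustration, a minimal way to check whether one of these ports is reachable from a given machine (the host name below is just a placeholder):

```python
import socket

# Placeholder host/port for illustration: the NameNode web UI on 50070.
HOST = "namenode.example.internal"
PORT = 50070

# Try to open a TCP connection; a timeout or refusal means the port
# is not reachable from wherever this script is run.
try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable from this host")
except OSError as exc:
    print(f"{HOST}:{PORT} is NOT reachable from this host ({exc})")
```

Running this from a node inside the cluster versus from an external machine shows whether a port is exposed beyond the cluster network.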

Thanks,

Avijeet

1 ACCEPTED SOLUTION

Super Guru
@Avijeet Dash

In a cloud environment you create the cluster within a VPC (on AWS) or an Azure Virtual Network, which becomes an extension of your own network. In addition, both cloud environments (and other major providers) offer network ACLs. You are not really opening up ports to the DMZ. Any practical deployment should use these features regardless of Hadoop.
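For example, on AWS this usually amounts to a security group rule that allows the Hadoop ports only from the VPC's own address range. A rough boto3 sketch (the group ID, CIDR, and region below are placeholders):

```python
import boto3

# Hypothetical IDs/CIDRs for illustration; substitute your own.
SECURITY_GROUP_ID = "sg-0123456789abcdef0"   # security group attached to the cluster nodes
VPC_CIDR = "10.0.0.0/16"                     # address range of the VPC / virtual network

ec2 = boto3.client("ec2", region_name="us-east-1")

# Allow the NameNode web UI (50070) only from inside the VPC,
# so it is never exposed to the public internet.
ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 50070,
            "ToPort": 50070,
            "IpRanges": [{"CidrIp": VPC_CIDR}],
        }
    ],
)
```

The same pattern applies to the other Hadoop ports: scope each rule to the VPC CIDR (or to specific subnets) instead of 0.0.0.0/0.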


3 REPLIES


Super Collaborator

Thanks @mqureshi

Can you please confirm: for a cluster deployed without a VPC, is there any way to secure Hadoop with all these ports open?

I am thinking of Knox as one option. Is there anything else that can be done quickly? Also, will Knox work without LDAP/AD?

Regards,

Avijeet

Super Guru

The only thing you can do is limit which IPs can access your cluster, essentially by specifying security rules for inbound traffic (and outbound as well).

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-network-security.html#ec2-classic-security-...
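For example, with boto3 that could look roughly like the following: replacing a wide-open 0.0.0.0/0 rule on the Ambari port with a single trusted source IP (the group ID, IP, and port are placeholders):

```python
import boto3

# Placeholder values for illustration; replace with your own.
SECURITY_GROUP_ID = "sg-0123456789abcdef0"
ADMIN_IP = "203.0.113.10/32"   # the one workstation allowed to reach Ambari
AMBARI_PORT = 8080

ec2 = boto3.client("ec2", region_name="us-east-1")

# Remove an existing rule that exposes the port to the whole internet...
ec2.revoke_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpProtocol="tcp",
    FromPort=AMBARI_PORT,
    ToPort=AMBARI_PORT,
    CidrIp="0.0.0.0/0",
)

# ...and allow it only from the single trusted IP instead.
ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpProtocol="tcp",
    FromPort=AMBARI_PORT,
    ToPort=AMBARI_PORT,
    CidrIp=ADMIN_IP,
)
```

This only restricts who can reach the ports; it is not a substitute for Kerberos, Knox, or other Hadoop-level security.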