ports required to be open

Expert Contributor

Hi All,

To install and run HDP using Ambari, many ports (such as 50070) need to be open.

However, on cloud platforms, keeping these ports open creates risks.

Is there a way to keep them accessible to the cluster services but blocked from the outside internet?

Thanks,

Avijeet

3 REPLIES

Super Guru
@Avijeet Dash

In a cloud environment you create the cluster within a VPC (AWS) or an Azure Virtual Network, which becomes an extension of your own network. In addition, both of these cloud providers (and other major ones) offer network ACLs. You are not really opening up ports to a DMZ. Any practical deployment should use these features, regardless of Hadoop.
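For illustration, here is a minimal sketch (using boto3; the security group ID and CIDR are placeholder assumptions, not values from this thread) of an AWS security group rule that keeps a port such as 50070 reachable only from inside the VPC, so cluster services can still use it while it stays closed to the public internet:

```python
# Hypothetical sketch: allow the HDFS NameNode UI port (50070) only from
# inside the VPC's address range. Group ID and CIDR are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

VPC_CIDR = "10.0.0.0/16"                      # assumed VPC address range
SECURITY_GROUP_ID = "sg-0123456789abcdef0"    # placeholder security group

ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 50070,
            "ToPort": 50070,
            # Only hosts inside the VPC can reach this port; no 0.0.0.0/0
            # rule is created, so it stays closed to the internet.
            "IpRanges": [{"CidrIp": VPC_CIDR, "Description": "intra-VPC only"}],
        }
    ],
)
```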

Expert Contributor

Thanks @mqureshi

Can you please confirm: for a cluster deployed without a VPC, is there any way to secure Hadoop with all these ports open?

I'm thinking of Knox as one option. Is there anything else that can be done quickly? Also, will Knox work without LDAP/AD?

Regards,

Avijeet

Super Guru

The only thing you can do is limit which IPs can access your cluster, basically by specifying security rules for inbound traffic (and optionally outbound as well).

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-network-security.html#ec2-classic-security-...
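As a sketch of that approach (again with boto3; the group ID and IP addresses are placeholder assumptions, not values from this thread), you can remove a wide-open inbound rule and whitelist a single admin IP for Ambari and SSH:

```python
# Hypothetical sketch: restrict inbound access so only one admin IP can reach
# Ambari (8080) and SSH (22). Group ID and IPs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

SECURITY_GROUP_ID = "sg-0123456789abcdef0"    # placeholder security group
ADMIN_IP = "203.0.113.45/32"                  # assumed admin workstation IP

# Remove a rule that exposes Ambari to the whole internet
# (raises InvalidPermission.NotFound if no such rule exists).
ec2.revoke_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 8080, "ToPort": 8080,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# Allow only the admin IP to reach Ambari and SSH.
ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 8080, "ToPort": 8080,
         "IpRanges": [{"CidrIp": ADMIN_IP}]},
        {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
         "IpRanges": [{"CidrIp": ADMIN_IP}]},
    ],
)
```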
