Step 1: Do not put anything into the cloud unless you have a CISO, a Chief Security Architect, a certified cloud administrator, a full understanding of your PII and private data, a lawyer to defend you against the coming lawsuits, a full understanding of Hadoop, Hadoop-certified administrators, a Hadoop premier support contract, a security plan, and a full understanding of your Hadoop architecture and layout.
Step 2: Study all running services in Ambari.
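Ambari exposes a REST API, so you can script this instead of clicking through the UI. Here is a minimal Python sketch that lists every service and its state; the server URL, cluster name, and credentials are placeholder assumptions you would swap for your own.

```python
# A minimal sketch using the Ambari REST API to list services and their states.
# Host, cluster name, and credentials are placeholders -- substitute your own.
import requests

AMBARI = "http://ambari.example.com:8080"   # assumed Ambari server URL
CLUSTER = "mycluster"                        # assumed cluster name
AUTH = ("admin", "changeme")                 # never leave default credentials in place!

resp = requests.get(
    f"{AMBARI}/api/v1/clusters/{CLUSTER}/services?fields=ServiceInfo/state",
    auth=AUTH,
)
resp.raise_for_status()

for svc in resp.json()["items"]:
    info = svc["ServiceInfo"]
    print(f'{info["service_name"]}: {info["state"]}')
```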
Step 3: Audit and confirm all of your TCP/IP ports. Hadoop has a lot of them!
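As a first pass, a plain TCP connect scan from outside the cluster tells you what is actually reachable. A hedged sketch follows; the hostname and the port list are illustrative assumptions, not a complete inventory of Hadoop's ports.

```python
# A quick TCP connect scan of some well-known Hadoop ports, run from wherever
# you want to test reachability. The port list is illustrative, not complete.
import socket

HOST = "hadoop-master.example.com"  # assumed hostname
PORTS = {
    8020: "HDFS NameNode RPC",
    50070: "HDFS NameNode UI (9870 on Hadoop 3.x)",
    8088: "YARN ResourceManager UI",
    2181: "ZooKeeper",
    8080: "Ambari UI",
    10000: "HiveServer2",
}

for port, name in PORTS.items():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(2)
        status = "OPEN" if s.connect_ex((HOST, port)) == 0 else "closed/filtered"
        print(f"{port:>6} {name}: {status}")
```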
Step 4: If you are not using a service, do not run it.
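Stopping a service can also be scripted through the Ambari REST API: setting ServiceInfo/state to INSTALLED stops it. A sketch, with the same placeholder host, cluster, and credentials as above, and FLUME as an arbitrary example of an unused service:

```python
# A hedged sketch of stopping an unused service through the Ambari REST API.
import json
import requests

AMBARI = "http://ambari.example.com:8080"
CLUSTER = "mycluster"
AUTH = ("admin", "changeme")
SERVICE = "FLUME"  # example: a service you are not actually using

resp = requests.put(
    f"{AMBARI}/api/v1/clusters/{CLUSTER}/services/{SERVICE}",
    auth=AUTH,
    headers={"X-Requested-By": "ambari"},  # Ambari requires this header on writes
    data=json.dumps({
        "RequestInfo": {"context": f"Stop {SERVICE} (unused)"},
        "Body": {"ServiceInfo": {"state": "INSTALLED"}},  # INSTALLED == stopped
    }),
)
resp.raise_for_status()
print(resp.status_code)  # 202 means Ambari accepted the stop request
```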
Step 5: By default, disable all access to everything, always. Only open a port or grant access when a critical person or process genuinely needs it.
Step 6: SSL, SSH, VPN and Encryption Everywhere.
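One concrete check you can automate: confirm a TLS endpoint actually presents a valid certificate and note when it expires. This stdlib-only sketch assumes a hypothetical Knox host and port:

```python
# A small stdlib-only check that a TLS endpoint presents a valid certificate,
# and when that certificate expires. Host and port are placeholders.
import socket
import ssl
from datetime import datetime

HOST, PORT = "knox.example.com", 8443  # assumed TLS endpoint

context = ssl.create_default_context()  # verifies the cert chain and hostname
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()
        expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
        print(f"{HOST}:{PORT} uses {tls.version()}, cert expires {expires}")
```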
Step 7: Run Knox! Set it up correctly.
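Once Knox is up, verify that WebHDFS answers through the gateway (and, separately, that it no longer answers around it). The gateway URL, the "default" topology name, and the demo credentials below are all assumptions:

```python
# A sketch of verifying that WebHDFS is reachable through the Knox gateway.
# Gateway URL, topology name, credentials, and CA path are assumptions.
import requests

KNOX = "https://knox.example.com:8443/gateway/default"  # "default" topology assumed
AUTH = ("guest", "guest-password")  # Knox typically fronts LDAP/AD; demo creds only

resp = requests.get(
    f"{KNOX}/webhdfs/v1/?op=LISTSTATUS",
    auth=AUTH,
    verify="/etc/security/knox-ca.pem",  # assumed CA bundle; never verify=False
)
resp.raise_for_status()
print(resp.json()["FileStatuses"]["FileStatus"][:3])  # first few root entries
```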
Step 8: Run Kali and audit all your IPs and ports.
Step 9: Use Kali hacking tools to attempt to access all your web ports, shells and other access points.
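Kali's toolbox (nmap, nikto, and friends) is the real audit; as a lightweight supplement, a script like the one below flags Hadoop web UIs that answer without any authentication. The hostname and UI list are assumptions:

```python
# Not a replacement for Kali's tools, but a quick hedged sketch that flags
# Hadoop web UIs answering without authentication. Hostname is a placeholder.
import requests

HOST = "hadoop-master.example.com"  # assumed hostname
UIS = {
    "http://{h}:50070/": "HDFS NameNode UI",
    "http://{h}:8088/cluster": "YARN ResourceManager UI",
    "http://{h}:8080/": "Ambari UI",
}

for url_tpl, name in UIS.items():
    url = url_tpl.format(h=HOST)
    try:
        resp = requests.get(url, timeout=5, allow_redirects=False)
    except requests.RequestException:
        print(f"{name}: unreachable (good, if intentional)")
        continue
    if resp.status_code == 200:
        print(f"WARNING: {name} at {url} answered without auth")
    else:
        print(f"{name}: HTTP {resp.status_code}")
```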
Step 10: Run in a VPC.
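Carving out the VPC is scriptable too. A minimal boto3 sketch follows; the region and CIDR are assumptions, and a real deployment also needs private subnets, route tables, and NACLs:

```python
# A minimal boto3 sketch of creating a dedicated VPC for the cluster.
# Region and CIDR are assumptions; subnets and routing are left to you.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")  # assumed private CIDR
vpc_id = vpc["Vpc"]["VpcId"]
ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "Name", "Value": "hadoop-vpc"}])
print(f"Created {vpc_id}; now add private subnets, route tables, and NACLs.")
```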
Step 11: Set up security groups. Never open anything to 0.0.0.0/0, all ports, or all IPs!
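Here is what that looks like in boto3: a default-deny group with a single SSH rule scoped to one admin CIDR. The VPC ID and CIDR are placeholders:

```python
# A hedged boto3 sketch of a locked-down security group: SSH from one admin
# CIDR, nothing else. All IDs and CIDRs are placeholders for your own values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

sg = ec2.create_security_group(
    GroupName="hadoop-admin-only",
    Description="SSH from the admin network only; default-deny everything else",
    VpcId="vpc-0123456789abcdef0",  # assumed VPC ID from the previous step
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.0/24",  # your admin CIDR, NOT 0.0.0.0/0
                      "Description": "admin VPN"}],
    }],
)
print(f'Created {sg["GroupId"]} with SSH open to the admin CIDR only.')
```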
Step 12: If this seems too hard, don't run in the cloud.
Step 14: Step 13 is unlucky, skip that one.
Step 15: Read all the recommended security documentation and use it.
Step 16: Kerberize everything.
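A quick sanity check (a sketch, not a substitute for a real Kerberos audit) is to parse core-site.xml and confirm hadoop.security.authentication is actually set to kerberos; the config path below is the typical HDP location but still an assumption:

```python
# A stdlib-only sanity check that a cluster's core-site.xml enables Kerberos.
# The file path is an assumption; adjust for your distribution.
import xml.etree.ElementTree as ET

CORE_SITE = "/etc/hadoop/conf/core-site.xml"  # typical HDP location

props = {
    p.findtext("name"): p.findtext("value")
    for p in ET.parse(CORE_SITE).getroot().iter("property")
}
auth = props.get("hadoop.security.authentication", "simple")
if auth != "kerberos":
    print(f"NOT KERBERIZED: hadoop.security.authentication={auth}")
else:
    print("Kerberos is on; also check hadoop.rpc.protection =",
          props.get("hadoop.rpc.protection", "authentication"))
```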
Step 17: Run Apache Metron for real-time security monitoring.
My recommendation is to get a professional services contract with an experienced Hadoop organization, or use something managed like Microsoft HDInsight or HDC.
There are more steps like these if you are also running your own visualization tools, other data websites, other tools, Oracle, SQL Server, mail, NiFi, Druid, etc.