Member since: 07-30-2019
Posts: 181
Kudos Received: 205
Solutions: 51
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4991 | 10-19-2017 09:11 PM
 | 1602 | 12-27-2016 06:46 PM
 | 1244 | 09-01-2016 08:08 PM
 | 1183 | 08-29-2016 04:40 PM
 | 3029 | 08-24-2016 02:26 PM
06-09-2016
02:14 PM
2 Kudos
@Johnny Fugers There are several Spark tutorials that use the Sandbox available on the Hortonworks website. You may be interested in the Interacting with Data on HDP Using Apache Zeppelin and Apache Spark tutorial. We also offer online training via Hortonworks University focused on Data Science and Spark.
06-09-2016
02:08 PM
1 Kudo
@Johnny Fugers Have you considered using NiFi to load the data? You can read from many different sources, merge the content into large enough portions to optimize the HDFS use, and write the data directly into HDFS.
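A minimal flow for that pattern might look like the sketch below. The processor names (GetFile, MergeContent, PutHDFS) are from the standard NiFi bundle, but the directories and size thresholds are illustrative assumptions, not recommendations:

```
GetFile
  Input Directory: /data/incoming              # hypothetical source path
        |
        v
MergeContent
  Merge Strategy: Bin-Packing Algorithm
  Minimum Group Size: 128 MB                   # aim near the HDFS block size
        |
        v
PutHDFS
  Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
  Directory: /landing/raw                      # hypothetical target path
```

Merging before the PutHDFS step is what avoids the classic small-files problem on HDFS.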
06-08-2016
03:09 PM
2 Kudos
@Pardeep Gorla Absolutely! You can use an MIT KDC to provide Kerberos authentication. There are a couple of ways to do this. FreeIPA is a good tool that combines LDAP and KDC management for RedHat (CentOS) systems. This will give you the ability to also manage user sync for Ambari and Ranger with the OpenLDAP managed by FreeIPA. You will need to use the "Manually Manage Kerberos Principals" option when enabling Kerberos on the cluster for now. FreeIPA integration is on the roadmap for Ambari, but is not available yet as of Ambari 2.2.2.
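If you go with a plain MIT KDC rather than FreeIPA, the principals and keytabs that Ambari lists under "Manually Manage Kerberos Principals" can be created with the standard kadmin tools. A hedged sketch (the realm, hostname, and keytab paths are assumptions for illustration):

```
# On the KDC host, as a Kerberos admin:
kadmin.local -q "addprinc -randkey nn/master1.example.com@EXAMPLE.COM"
kadmin.local -q "xst -k /etc/security/keytabs/nn.service.keytab nn/master1.example.com@EXAMPLE.COM"

# Verify the exported keytab:
klist -kt /etc/security/keytabs/nn.service.keytab
```

You would repeat this for each service principal Ambari asks for, then distribute the keytabs to the appropriate hosts with restrictive permissions.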
06-08-2016
02:55 PM
3 Kudos
@mkataria The article you referenced does contain some good information about security exploits against the Microsoft Windows Active Directory KDC. Some of them require obtaining certain keys or privileges in order to compromise security, and some require access to the domain controller itself. The article is a bit dated, as it is from a couple of years ago, and investigating some of the bugs mentioned shows that Microsoft has since patched several of these holes. Other attacks can be defended against by understanding the attack and eliminating the access required to use the exploit.

As with any computer system, the key is securing the systems themselves. Keep users off of systems they shouldn't have access to. If there's a memory exploit on a server, don't let users log in to that server. If getting access to a file would compromise security, don't allow access to that file.

The ability to arbitrarily generate Kerberos tickets has the same impact in a Hadoop environment as it would in any network. If a user can obtain a ticket to use HDFS, for example, that user may be able to access data that s/he shouldn't. This is why security is such an important and complex topic: ensuring that the various systems are secure individually AND together is key to the security of your information. To address the specific issues mentioned in the article, I would recommend contacting Microsoft, determining which issues apply to your particular OS version, and working with Microsoft on the best way to secure the domain controller against these attacks.
06-08-2016
02:22 PM
1 Kudo
@chennuri gouri shankar HDInsight does not include the full stack of HDP components. If you'd like to use Ranger and other components not included with HDI (e.g. Spark, Kafka, Storm), then you should look at using HDP on the Azure Marketplace. You can stand up a cluster quickly and use the full HDP stack.
06-07-2016
06:30 PM
1 Kudo
@R M When you create the SSH action, you can give Oozie the user and host on which to execute the command:

<workflow-app name="[WF-DEF-NAME]" xmlns="uri:oozie:workflow:0.1">
    ...
    <action name="[NODE-NAME]">
        <ssh xmlns="uri:oozie:ssh-action:0.1">
            <host>[USER]@[HOST]</host>
            <command>[SHELL]</command>
            <args>[ARGUMENTS]</args>
            ...
            <capture-output/>
        </ssh>
        <ok to="[NODE-NAME]"/>
        <error to="[NODE-NAME]"/>
    </action>
    ...
</workflow-app>
The command needs to exist on the node you specify, and it will be run as the user named in the action definition, from that user's home directory. The SSH action page of the Oozie docs has some additional information as well.
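To run a workflow like this, you point Oozie at the workflow application path in HDFS via a job.properties file and submit it with the Oozie CLI. A minimal sketch (the hostnames, ports, and HDFS path are assumptions for illustration):

```
# job.properties
nameNode=hdfs://master1.example.com:8020
jobTracker=master1.example.com:8050
oozie.wf.application.path=${nameNode}/user/oozie/apps/ssh-demo
```

Then submit and start it with: oozie job -oozie http://master1.example.com:11000/oozie -config job.properties -run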
06-07-2016
05:26 PM
@Kaliyug Antagonist The permissions given to the Ambari agent user include those required to create service accounts, install packages, start and stop services, and run commands as the service accounts. Once the sudo rules are in place, you can install, start, and stop all of the various services in the HDP stack.
06-07-2016
05:21 PM
1 Kudo
@Mohana Murali Gurunathan Ranger does not currently support authorization for Cassandra. JIRA RANGER-925 has been opened to add this functionality; you can track the progress there.
06-07-2016
04:01 PM
2 Kudos
@Kaliyug Antagonist There is a way to install Ambari and the HDP stack as a non-root user, but someone with root privileges will need to set up the access for you: certain sudo privileges must be granted to the user you're going to run the Ambari agent as. The Ambari Security guide has a section on how to set up a non-root Ambari installation. One of the privileges given to the ambari user is the ability to install packages (via yum, zypper, etc.). Once these rules are in place, you can use them to install the ambari-server and ambari-agent packages on all of your nodes and proceed with the installation.
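For reference, the sudo rules from the Ambari Security guide take roughly this shape in /etc/sudoers. The exact command and account lists vary by Ambari version and stack, so treat this as an illustrative fragment rather than the authoritative list:

```
# Customizable Users: let the agent run commands as the service accounts
ambari ALL=(ALL) NOPASSWD:SETENV: /bin/su hdfs *, /bin/su yarn *, /bin/su hive *

# Commands: package management and stack selection
ambari ALL=(ALL) NOPASSWD:SETENV: /usr/bin/yum, /usr/bin/rpm, /usr/bin/hdp-select
```

The NOPASSWD:SETENV tags matter: the agent runs non-interactively and needs to pass environment variables through to the commands it invokes.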
06-07-2016
03:49 PM
@sbhat I've not seen a comprehensive document that details which commands are available for which services, but the Ambari CWiki page has some great usage scenarios and FAQs that may help.