Member since: 09-21-2015
Posts: 85
Kudos Received: 75
Solutions: 7
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1954 | 04-21-2016 12:22 PM |
|  | 4814 | 03-12-2016 02:19 PM |
|  | 1706 | 10-29-2015 07:50 PM |
|  | 2079 | 10-02-2015 04:21 PM |
|  | 5763 | 09-29-2015 03:08 PM |
10-27-2015
10:25 PM
1 Kudo
@Sowmya Ramesh - Good answer!
Since this was specific to "recipes", could you move the part addressing "recipes" to the top of your answer and keep the further detail on "parallel" and "order" after it? Then I can accept.
10-09-2015
09:24 AM
@Matt Carter - just a bump: please confirm whether one of these answers worked, or reply to them for clarification.
10-03-2015
08:13 AM
When using the Falcon "Mirror Recipe", what happens if an instance is still running when the next is scheduled to start?
Labels:
- Apache Falcon
10-02-2015
04:47 PM
3 Kudos
Impersonation is a key concept throughout the Hadoop ecosystem. Impersonation grants a user (also known as a SuperUser or ProxyUser) the right to access Hadoop on behalf of other users. It's similar to the idea of 'sudo' within Linux. To enable it, you set the 'proxyuser' settings based on the user the service is running as, the groups or users you want it to be able to act on behalf of, and the hosts it should be able to do that from.

For example, for Ambari Views with:
- Ambari running as the user 'root' (which is the default)
- Ambari allowed to act on behalf of users in the groups 'users' and 'hive-users' (just an example, as you may have similar groups in LDAP)
- An Ambari hostname of 'ambarihost.domain.local'

you would set this in 'HDFS -> core-site' from Ambari:

hadoop.proxyuser.root.groups=users,hive-users
hadoop.proxyuser.root.hosts=ambarihost.domain.local

More detail is available in the documentation:
- Apache Hadoop: Proxy user - Superusers Acting On Behalf Of Other Users
- Apache Oozie: User ProxyUser Configuration
- Apache YARN: yarn-site (yarn.resourcemanager.webapp.proxyuser.USERNAME.groups and yarn.resourcemanager.webapp.proxyuser.USERNAME.hosts)
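As a quick way to confirm the proxyuser settings took effect, here is a sketch using WebHDFS's `doas` parameter (the hostname 'namenodehost' and end user 'alice' are placeholders; it assumes simple authentication and the default NameNode HTTP port 50070):

```bash
# List alice's home directory while authenticated as 'root' but acting
# on behalf of 'alice' via the proxyuser (impersonation) mechanism.
curl -sS "http://namenodehost:50070/webhdfs/v1/user/alice?op=LISTSTATUS&user.name=root&doas=alice"

# If the proxyuser settings are wrong, WebHDFS typically returns an
# AuthorizationException such as "Unauthorized connection for super-user: root".
```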
10-02-2015
04:32 PM
Please note that one should never use * for these settings:
- hosts= should be set to the hostname of the Ambari Server.
- groups= should list only the groups which Ambari (running as root) is allowed to impersonate.
10-02-2015
04:22 PM
Error is: 2015-10-01 21:01:17,660 - Error : [Errno 110] Connection timed out
10-02-2015
04:21 PM
You showed this error:

Error : [Errno 110] Connection timed out

That means the Ranger KMS service cannot reach the Ranger Admin service (http://rangeradminhost:6080). Verify that "External URL" is set properly for Ranger & Ranger KMS. Typically it's `http://hostnameofrangeradmin:6080`.

Common mistakes I've seen:
- Putting a `/` on the end. It won't work with this.
- Using a hostname which is not resolvable internally between all of the hosts. (The name "External URL" is confusing since it's really the "Internal URL".) For example, setting it to `http://localhost:6080`, which will only work from the Ranger Admin host.
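If you want to check reachability directly, a minimal test (assuming the default Ranger Admin port 6080; 'rangeradminhost' is a placeholder) is to curl the URL from each Ranger KMS host:

```bash
# Run from each Ranger KMS host. A reachable Ranger Admin prints an HTTP
# status code (e.g. 200 or a 3xx redirect); a hang followed by a timeout
# reproduces "[Errno 110] Connection timed out".
curl -sS -o /dev/null -w "%{http_code}\n" --connect-timeout 10 "http://rangeradminhost:6080/"
```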
09-29-2015
08:41 PM
LDAPS is not required for syncing Ambari users. The question is about the Kerberos Wizard.
09-29-2015
03:08 PM
3 Kudos
As part of the process to Kerberize the cluster, Ambari must connect to the Active Directory environment using LDAPS to create the relevant Kerberos "principals". But LDAPS is not enabled by default in Active Directory. To configure it and prepare the cluster hosts:

1. Enable LDAPS in Active Directory (detailed by Microsoft).
2. Trust the AD certificate on the Linux hosts. Only needed if "self-signing" the certificate.

General steps for 2:

On the Windows host:
- Server Manager -> Tools -> Certificate Authority
- Action -> Properties
- General tab -> View Certificate -> Details -> Copy to File
- Choose the format "Base-64 encoded X.509 (.CER)"
- Save as 'activedirectory.cer' (or whatever you like)
- Open with Notepad -> Copy contents

On all Linux hosts (RedHat/CentOS instructions; Ubuntu/SUSE would be similar):
- Create /etc/pki/ca-trust/source/anchors/activedirectory.pem
- Paste the contents of the certificate file above
- Execute:

sudo yum -y install ca-certificates
sudo update-ca-trust force-enable
sudo update-ca-trust extract
sudo update-ca-trust check

(You can automate this as done here)
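To verify the trust took effect, one option (a sketch: 'adhost.domain.local' is a placeholder for your domain controller, and the CA bundle path is the RedHat/CentOS default populated by update-ca-trust) is an openssl test against the LDAPS port:

```bash
# 636 is the standard LDAPS port. Look for "Verify return code: 0 (ok)",
# which means the AD certificate chain is now trusted by this host.
openssl s_client -connect adhost.domain.local:636 \
  -CAfile /etc/pki/tls/certs/ca-bundle.crt </dev/null | grep "Verify return code"
```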
09-28-2015
10:42 PM
4 Kudos
The 'hadoop' command-line natively supports Encryption Zones, so most file system operations (copyFromLocal, put, distCp) will work. See more example usage in the Apache Hadoop docs, and note that there are some considerations for copying between encrypted and unencrypted locations.
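For illustration, a minimal end-to-end sketch (the key name 'mykey' and zone path '/enczone' are placeholders; it assumes a running KMS and HDFS superuser privileges for -createZone):

```bash
# Create a key in the KMS, then mark an empty directory as an encryption zone.
hadoop key create mykey
hdfs dfs -mkdir /enczone
hdfs crypto -createZone -keyName mykey -path /enczone

# Ordinary file system operations then encrypt/decrypt transparently.
hdfs dfs -put localfile.txt /enczone/
hdfs dfs -cat /enczone/localfile.txt
```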