Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2625 | 11-01-2016 05:43 PM |
| | 8763 | 11-01-2016 05:36 PM |
| | 4941 | 07-01-2016 03:20 PM |
| | 8271 | 05-25-2016 11:36 AM |
| | 4437 | 05-24-2016 05:27 PM |
01-23-2016
11:28 AM
@Michel Meulpolder Check whether any other job is running. Most of the time the default queue is blocked because a previous job is consuming all the resources, so your job just sits and waits. You can use Ambari to check the job listing. Your job id is application_1453543525470_0004. Log in to the box and run `yarn application -list`, then `yarn application -kill <appid>` if something is hogging the queue.
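A minimal sketch of that workflow from a cluster node (the application id below is the one from the question; run as a user with YARN access):

```shell
# Show applications that are running or waiting for resources --
# this reveals whether another job is occupying the default queue.
yarn application -list -appStates RUNNING,ACCEPTED

# If a previous application is holding all the resources,
# kill it by its application id.
yarn application -kill application_1453543525470_0004
```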
01-23-2016
04:57 AM
6 Kudos
Labels:
- Apache Ranger
01-23-2016
04:34 AM
2 Kudos
@Kevin Vasko I believe you are asking about SSO (Single Sign-On). SSO and Knox integration works: http://hortonworks.com/blog/hadoop-security-today-and-tomorrow/ Perimeter-level security with Apache Knox:
Apache Hadoop has Kerberos for authentication. However, some organizations require integration with their enterprise identity management and Single Sign-On (SSO) solutions. Hortonworks created Apache Knox Gateway (Apache Knox) to provide Hadoop cluster security at the perimeter for REST/HTTP requests and to enable the integration of enterprise identity-management solutions. Apache Knox provides integration with corporate identity systems such as LDAP and Active Directory (AD), and will also integrate with SAML-based SSO and other SSO systems.

Apache Knox also protects a Hadoop cluster by hiding its network topology to eliminate the leak of network internals. A network firewall may be configured to deny all direct access to a Hadoop cluster and accept only the connections coming from the Apache Knox Gateway over HTTP. These measures dramatically reduce the attack vector.

Finally, Apache Knox promotes the use of REST/HTTP for Hadoop access. REST is proven, scalable, and provides client interoperability across languages, operating systems, and computing devices. By using Hadoop REST/HTTP APIs through Knox, clients do not need a local Hadoop installation.
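To make the REST/HTTP point concrete, here is a sketch of a WebHDFS call proxied through the Knox gateway. The hostname `knox.example.com`, the topology name `default`, and the `guest` credentials are illustrative assumptions, not values from the question:

```shell
# List an HDFS directory via WebHDFS, going through Knox instead of
# contacting the NameNode directly. Knox authenticates the caller
# (here with HTTP basic auth against its configured LDAP/AD provider)
# and forwards the request into the cluster.
curl -ik -u guest:guest-password \
  "https://knox.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS"
```

The client only needs `curl` and the gateway URL; no Hadoop installation or direct network path to the cluster is required.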
01-23-2016
12:55 AM
@Predrag Minovic Both of them are gems... Now, take a look at this: http://docs.hortonworks.com/HDPDocuments/Ambari-2.2.0.0/bk_releasenotes_ambari_2.2.0.0/content/ambari_relnotes-2.2.0.0-new-features.html Jira: https://issues.apache.org/jira/browse/AMBARI-13431
01-23-2016
12:53 AM
@rbalam Based on this: "The history can be stored in memory or in a leveldb database store; the latter ensures the history is preserved over Timeline Server restarts. The single-server implementation of the Timeline Server places a limit on the scalability of the service; it also prevents the service being High-Availability component of the YARN infrastructure." You can research http://leveldb.org/ and see whether you want replication. I don't think any of this is supported.
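For reference, a minimal sketch of the `yarn-site.xml` settings that switch the Timeline Server to the leveldb store (the filesystem path is an illustrative assumption; adjust for your cluster):

```xml
<!-- Persist timeline history in leveldb so it survives
     Timeline Server restarts (instead of the in-memory store). -->
<property>
  <name>yarn.timeline-service.store-class</name>
  <value>org.apache.hadoop.yarn.server.timeline.LeveldbTimelineStore</value>
</property>
<property>
  <name>yarn.timeline-service.leveldb-timeline-store.path</name>
  <!-- Local directory on the Timeline Server host; example path. -->
  <value>/hadoop/yarn/timeline</value>
</property>
```

Note this is still a single-server store: as the quoted documentation says, it does not make the Timeline Server highly available.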
01-23-2016
12:31 AM
@Predrag Minovic I am assuming that you are looking for a way to automate the security integration. This link has really nice content that can help you meet the requirement... Thanks to @Ali Bajwa https://github.com/abajwa-hw/ambari-workshops/blob/master/blueprints-demo-security.md
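As a rough illustration of the approach in that workshop, an Ambari blueprint can declare Kerberos security so that cluster creation and Kerberization happen in one automated step. This fragment is a sketch (blueprint name, stack version, and the omitted `host_groups`/`configurations` sections are assumptions, not values from the link):

```json
{
  "Blueprints": {
    "blueprint_name": "secure-cluster",
    "stack_name": "HDP",
    "stack_version": "2.3",
    "security": {
      "type": "KERBEROS"
    }
  }
}
```

Posting this blueprint (plus a Kerberos descriptor and KDC credentials) to the Ambari REST API drives the security setup without manual wizard steps.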
01-23-2016
12:27 AM
1 Kudo
@Predrag Minovic This is your best shot https://cwiki.apache.org/confluence/display/AMBARI/Automated+Kerberizaton
01-23-2016
12:23 AM
1 Kudo
Hi @Andrea Squizzato Make sure that you meet the system requirements: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4-Win/bk_QuickStart_HDPWin/content/SystemRequirements.html
01-22-2016
11:56 PM
1 Kudo
@rbalam @jeff Honestly, I would prefer some control over the upgrade process because of the complications and changes involved. It's a matter of educating and communicating with users. I don't think there is a 100% automated way to upgrade/downgrade HDP as of now.