Member since: 01-19-2017
Posts: 3676
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 593 | 06-04-2025 11:36 PM |
| | 1143 | 03-23-2025 05:23 AM |
| | 572 | 03-17-2025 10:18 AM |
| | 2158 | 03-05-2025 01:34 PM |
| | 1357 | 03-03-2025 01:09 PM |
08-08-2017
11:14 AM
@uri ben-ari I always use a JSON validator; I don't know of any other tool that could help. If the values don't match, your JSON won't be accepted, and the validator's output will give you hints as to which parameter is wrong. Also see Ambari blueprints.
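For example, a quick syntax check before submitting a blueprint (a minimal sketch; it assumes blueprint.json is in the current directory):

```bash
# Validate the JSON syntax of a blueprint file; on failure, json.tool
# prints the parse error with the offending line and column.
python -m json.tool blueprint.json > /dev/null \
  && echo "blueprint.json is valid JSON" \
  || echo "blueprint.json failed validation"
```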
08-08-2017
10:45 AM
@uri ben-ari You can generate your new blueprint.json from an existing cluster, replacing {ambari_host} and {Cluster_name} with the correct values:

curl -H "X-Requested-By: ambari" -X GET -u admin:admin http://{ambari_host}:8080/api/v1/clusters/{Cluster_name}\?format\=blueprint > blueprint.json

Hope that helps.
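For illustration, the same export with placeholder values filled in (host and cluster name below are hypothetical, not from the original post):

```bash
# Hypothetical values -- replace with your own Ambari host and cluster name.
AMBARI_HOST=ambari.example.com
CLUSTER_NAME=mycluster

# Export the running cluster's configuration as a blueprint; quoting the
# URL makes the shell escapes around ? and = unnecessary.
curl -H "X-Requested-By: ambari" -X GET -u admin:admin \
  "http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER_NAME}?format=blueprint" \
  > blueprint.json
```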
08-08-2017
10:35 AM
@uri ben-ari The .yml file is specific to the OS setup you run, in this case the IBM... Ambari blueprints use .json files submitted with curl. 1. What do you intend to achieve?
08-08-2017
10:14 AM
@uri ben-ari The .yml is a configuration file. It's basically a human-readable structured data format: "YAML stands for 'YAML Ain't Markup Language' and it is used extensively in Grav for its configuration files, blueprints, and also in page settings. YAML is to configuration what markdown is to markup."

In this case, your configuration file is "myBlueprint.yml".

The JSON file passed with -d @updatedBlueprint.json should be in the current directory where you are running the curl command.
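For reference, a minimal sketch of such a curl invocation (the Ambari host and blueprint name here are assumptions, not from the original post):

```bash
# Register a blueprint with Ambari. Run this from the directory that
# contains updatedBlueprint.json, otherwise curl cannot resolve the
# file referenced by -d @. Host and blueprint name are hypothetical.
curl -H "X-Requested-By: ambari" -X POST -u admin:admin \
  -d @updatedBlueprint.json \
  http://ambari.example.com:8080/api/v1/blueprints/myBlueprint
```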
08-03-2017
04:40 PM
@Aaron Norton What are the host entries in your Ambari server's /etc/hosts? Is the LDAP server entry correct? Please revert with the details.
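A quick way to check both (the LDAP hostname below is a placeholder):

```bash
# Inspect the static host entries on the Ambari server.
cat /etc/hosts
# Confirm the LDAP server's name resolves from this host.
getent hosts ldap.example.com
```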
08-01-2017
04:36 PM
@steve coyle The problem is with the parameter below: oozie_user_nproc_limit. Check this document.
08-01-2017
03:41 PM
@steve coyle Have you checked in limits_conf_dir = /etc/security/limits.d?
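A minimal sketch of how to inspect those limits (the per-service file name is an assumption; Ambari typically manages one in that directory):

```bash
# List the limits files dropped into the directory by Ambari and the OS.
ls /etc/security/limits.d/
# Show the effective max-processes (nproc) limit for the oozie user,
# assuming the oozie account has a login shell.
su - oozie -c 'ulimit -u'
```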
08-01-2017
03:35 PM
1 Kudo
@Hema Penumatsa Can you try to reinstall it on the node that's causing the problem? You will first need to stop the agent, then run the commands below:

# yum install epel-release
# yum erase ambari-agent
# yum install -y ambari-agent

Then edit the ambari-agent.ini and, under the [server] section, change the hostname to your Ambari server's FQDN:

vi /etc/ambari-agent/conf/ambari-agent.ini

[server]
hostname={your.ambari.server.hostname}

Then start the agent. This should resolve the problem.
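For completeness, the stop/start steps around the reinstall, using the standard ambari-agent service commands:

```bash
ambari-agent stop    # stop the agent before erasing the package
# ...reinstall and edit ambari-agent.ini as above...
ambari-agent start   # start the agent pointing at the new server
```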
08-01-2017
10:25 AM
1 Kudo
@Smart Data If you intend to run a secure Hadoop cluster, then there is no way you can avoid Kerberos. Below are the differences between Knox and Kerberos.

The Apache Knox Gateway is a system that provides a single point of authentication and access. It provides the following features:
- Single REST API access point
- Centralized authentication, authorization, and auditing for Hadoop REST/HTTP services
- LDAP/AD authentication, service authorization, and audit
- Eliminates SSH edge-node risks
- Hides the network topology

Layers of defense for a Hadoop cluster:
- Perimeter-level security: network security, Apache Knox (gateway)
- Authentication: Kerberos
- Authorization
- OS security: encryption of data on the network and in HDFS

Apache Knox can also access a Hadoop cluster over HTTP or HTTPS.

Current features of Apache Knox:
- Authentication by LDAP or a cloud SSO provider
- Provides services for HDFS, HCat, HBase, Oozie, Hive, YARN, and Storm
- HTTP access for Hive over JDBC is available (ODBC driver support in the future)

Hope that helps to explain.
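As a usage illustration, a REST call routed through Knox (the gateway host, "default" topology, and guest credentials below follow the Knox quick-start and are assumptions, not from the original post):

```bash
# List /tmp in HDFS via the WebHDFS API proxied by the Knox gateway;
# -i shows headers, -k skips TLS verification for a demo certificate.
curl -iku guest:guest-password \
  "https://knox.example.com:8443/gateway/default/webhdfs/v1/tmp?op=LISTSTATUS"
```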