Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 837 | 06-04-2025 11:36 PM |
| | 1413 | 03-23-2025 05:23 AM |
| | 705 | 03-17-2025 10:18 AM |
| | 2520 | 03-05-2025 01:34 PM |
| | 1649 | 03-03-2025 01:09 PM |
10-27-2017
11:28 AM
@Neha You never responded.
10-20-2017
09:01 AM
1 Kudo
@Florin Miron I think your problem is exactly what Jay SenSharma addressed. You are using Python version "python-2.7.5" or higher, which causes this issue:

[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:579)

You should either downgrade Python to a version lower than python-2.7.5, OR follow the steps in the following doc to fix the "certificate verify failed (_ssl.c" issue on RHEL7: Controlling and troubleshooting certificate verification https://access.redhat.com/articles/2039753#controlling-certificate-verification-7

Also, we suggest you remove the Ambari Server certificates as described in the following docs, so that new certificates can be generated by Ambari:
https://cwiki.apache.org/confluence/display/AMBARI/Handling+Expired+HTTPS+Certificates
https://community.hortonworks.com/articles/68799/steps-to-fix-ambari-server-agent-expired-certs.html
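The Red Hat article above describes a host-wide switch for Python certificate verification on RHEL 7 via the file /etc/python/cert-verification.cfg. Below is a minimal sketch of that file's content, written to a temp file here purely for illustration so nothing on the system is modified (and note: only disable verification if you understand the security trade-off):

```shell
# Sketch of the RHEL 7 /etc/python/cert-verification.cfg content described
# in the Red Hat article; written to a temp file so the real system
# configuration is not touched.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[https]
verify=disable
EOF
grep '^verify=' "$cfg"
```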
10-19-2017
10:35 AM
@Neha G This thread was opened by you and I responded comprehensively. If the solution resolved your problem, could you have the courtesy to mark this HCC thread as answered by clicking the "Accept" button? That way other HCC users can quickly find the solution when they encounter the same issue. Needless to say, members strive to help.
10-18-2017
10:13 AM
@uri ben-ari

curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://{your_ambari_server}:8080/api/v1/clusters/{your_cluster_name}/hosts/{dead_host}

Example:

curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://test.co.uk:8080/api/v1/clusters/hdp26/hosts/datanode2
10-18-2017
07:31 AM
@ilia kheifets I assume you have installed Postgres or MySQL on host ubuntu17 and have run the corresponding setup command:

ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
ambari-server setup --jdbc-db=postgres --jdbc-driver=/usr/share/java/postgresql-jdbc.jar

I have used the same user, password and database name for simplicity.

For Postgres

As root, switch to the postgres user:

# su - postgres
postgres@ubuntu17:~$ psql
psql (9.5.9)
Type "help" for help.

Hive user/database setup:

postgres=# DROP DATABASE IF EXISTS hive;
postgres=# CREATE USER hive PASSWORD 'hive';
postgres=# CREATE DATABASE hive OWNER hive;
postgres=# GRANT ALL PRIVILEGES ON DATABASE hive TO hive;
postgres=# \q

Oozie user/database setup:

postgres=# DROP DATABASE IF EXISTS oozie;
postgres=# CREATE USER oozie PASSWORD 'oozie';
postgres=# CREATE DATABASE oozie OWNER oozie;
postgres=# GRANT ALL PRIVILEGES ON DATABASE oozie TO oozie;
postgres=# \q

Ranger user/database setup:

postgres=# DROP DATABASE IF EXISTS rangerdb;
postgres=# CREATE USER rangerdb PASSWORD 'rangerdb';
postgres=# CREATE DATABASE rangerdb OWNER rangerdb;
postgres=# GRANT ALL PRIVILEGES ON DATABASE rangerdb TO rangerdb;
postgres=# \q

Edit pg_hba.conf:

vi /etc/postgresql/9.5/main/pg_hba.conf

At the end of the file add the entries below (in this example ambari, hive, oozie, ranger and mapred use the Postgres database):

local all ambari,hive,oozie,ranger,mapred md5
host all ambari,hive,oozie,ranger,mapred 0.0.0.0/0 md5
host all ambari,hive,oozie,ranger,mapred ::/0 md5

Then restart Postgres:

/etc/init.d/postgresql restart

For MySQL

In this example I assume the root password is hadoop.

Hive user setup:

# mysql -u root -phadoop
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'localhost';
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
CREATE USER 'hive'@'<HIVEMETASTORE_FQDN>' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'<HIVEMETASTORE_FQDN>';
FLUSH PRIVILEGES;

Create the Hive database (it must exist before loading the Hive database schema):

mysql -u root -phadoop
CREATE DATABASE hive;
quit;

Oozie user setup:

mysql -u root -phadoop
CREATE USER 'oozie'@'%' IDENTIFIED BY 'oozie';
GRANT ALL PRIVILEGES ON *.* TO 'oozie'@'%';
FLUSH PRIVILEGES;

Create the Oozie database (it must exist before loading the Oozie database schema):

mysql -u root -phadoop
CREATE DATABASE oozie;

After either the MySQL or Postgres setup, go to the Hive setup in the Ambari UI (see the attached screenshots); you will need to use the credentials set up earlier. Choose "Existing PostgreSQL/MySQL Database". During the initial setup Ambari will ask you to test the connectivity between Hive and the database; this MUST succeed. In both cases make sure the correct values are entered for:

Hive Database
Database Name
Database Username
Database Password
JDBC Driver class
Database URL
Hive Database Type

Hope that helps
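For the "Database URL" field, here is a small sketch of the standard JDBC URL formats for both databases. The host ubuntu17 and database names are the ones assumed above; 5432 and 3306 are the default Postgres/MySQL ports:

```shell
# Build the JDBC Database URL Ambari expects, from DB type, host and
# database name (default ports assumed).
jdbc_url() {
  case "$1" in
    postgres) printf 'jdbc:postgresql://%s:5432/%s\n' "$2" "$3" ;;
    mysql)    printf 'jdbc:mysql://%s:3306/%s\n' "$2" "$3" ;;
  esac
}
jdbc_url postgres ubuntu17 hive
jdbc_url mysql ubuntu17 oozie
```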
10-18-2017
06:21 AM
@uri ben-ari Check /etc/hosts on the Ambari server and make sure the old entries are removed and replaced with the new ones.

DELETE all host components mapped to this host, in this case DATANODE:

curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME/host_components/DATANODE

DELETE the host:

curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME

Then you will need to restart all services with stale configs.
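The two calls above can be sketched as a small parameterised script. AMBARI_SERVER_HOST, CLUSTERNAME and HOSTNAME are placeholders you must substitute with your own values, and admin:admin is the default credential assumed throughout this thread:

```shell
# Sketch: build the Ambari REST URLs for removing a dead host.
# Placeholder values; substitute your own server, cluster and host names.
ambari="AMBARI_SERVER_HOST"; cluster="CLUSTERNAME"; dead="HOSTNAME"
base="http://${ambari}:8080/api/v1/clusters/${cluster}/hosts/${dead}"
echo "1) DELETE ${base}/host_components/DATANODE"
echo "2) DELETE ${base}"
# The actual calls (require a live Ambari server):
# curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE "${base}/host_components/DATANODE"
# curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE "${base}"
```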
10-17-2017
04:54 PM
@Neha G In a kerberized cluster there are 2 types of keytabs or principals: headless and service principals. Headless principals are not bound to a specific host or node, and look like user@SRV.COM. Service principals are bound to a specific service and host or node, and use the syntax service/host@SRV.COM. So when you kinit with the hdfs.headless.keytab it acts as a DoAs: the user takes on hdfs permissions.
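The distinction can be checked mechanically: a service principal has a "/" between the primary and the host part, a headless one does not. A small sketch (the principal names below are made-up examples):

```shell
# Classify a Kerberos principal by its shape:
#   headless: primary@REALM        (not bound to a host)
#   service:  primary/host@REALM   (bound to a service on a host)
principal_type() {
  case "$1" in
    */*@*) echo "service"  ;;
    *@*)   echo "headless" ;;
    *)     echo "invalid"  ;;
  esac
}
principal_type "hdfs-mycluster@SRV.COM"        # headless
principal_type "nn/master1.srv.com@SRV.COM"    # service
```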
10-17-2017
02:51 PM
@ilia kheifets Can you enlighten me? I see a mixture of MySQL and Postgres commands in your new posting. We can resolve your issue with ONLY a Postgres installation, because the mixture looks confusing. It won't affect any service except Hive and Oozie, or Ranger if you intend to use Ranger for authorization, authentication and administration of security policies.
10-17-2017
02:36 PM
@ilia kheifets Is this the correct URL to your Hive database? jdbc:hive2://ambari-master.test.com:10015/default I see the error "Connection refused\' -e \'Invalid URL\'\' returned 1. Error". Can you walk me through the setup of the databases for Hive and Oozie? Was it with MySQL or Postgres? ambari-server setup -s (-s = silent install) should work with the embedded Postgres, so I can't tell why it didn't work in your case; it could be one of the standard OS preparations that you skipped.
10-17-2017
02:18 PM
@Kumar Peddibhotla Please can you paste here how you proceeded, step by step, so I can validate. Please remember to obscure ONLY important inputs like IP, REALM, HOSTNAME etc. It's important to see the steps to be able to debug. Did you add Druid as a service with Ambari? How can I reproduce your error?