Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 785 | 06-04-2025 11:36 PM |
| | 1364 | 03-23-2025 05:23 AM |
| | 675 | 03-17-2025 10:18 AM |
| | 2456 | 03-05-2025 01:34 PM |
| | 1598 | 03-03-2025 01:09 PM |
10-25-2016
05:08 PM
@Gary Cameron Yep, that was an error, but I am happy all is okay for you now.
10-24-2016
07:42 PM
2 Kudos
mysql -u root -p
CREATE USER '<HIVEUSER>' IDENTIFIED BY '<HIVEPASSWORD>';
FLUSH PRIVILEGES;
mysql -u hive -p
CREATE DATABASE hive;
FLUSH PRIVILEGES;
# mysql -u ambari -p
mysql> CREATE DATABASE <ambaridb>;
mysql> USE <ambaridb>;
mysql> SOURCE /var/lib/ambari-server/resources/Ambari-DDL-MySQL-CREATE.sql;
mysql> quit
# yum install mysql-connector-java
# chmod 644 /usr/share/java/mysql-connector-java.jar
# ambari-server setup
Checking JDK...
Enter advanced database configuration [y/n] (n)? y
Configuring database...
Choose option 3 (MySQL)
.....
....
...ambari-admin-2.1.0.1470.jar ...
Adjusting ambari-server permissions and ownership...
Ambari Server 'setup' completed successfully.
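For what it's worth, the MySQL JDBC driver can also be registered with Ambari non-interactively; a minimal sketch, assuming the connector jar sits at the path installed by the yum package above:
# register the MySQL JDBC driver with Ambari (run on the Ambari server host)
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar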
Now continue with the Hive setup; all should run successfully:
mysql -u root -p
CREATE USER '<HIVEUSER>' IDENTIFIED BY '<HIVEPASSWORD>';
FLUSH PRIVILEGES;
mysql -u hive -p
CREATE DATABASE hive;
FLUSH PRIVILEGES;
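One step the snippets above do not show is granting the new user rights on the metastore database; a minimal sketch, assuming the hive database and the <HIVEUSER> created above, with the '%' host scope as a placeholder you may want to restrict:
mysql -u root -p
-- allow the Hive metastore user full access to its own database
GRANT ALL PRIVILEGES ON hive.* TO '<HIVEUSER>'@'%';
FLUSH PRIVILEGES;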
10-18-2016
05:08 PM
1 Kudo
@suresh krish When your Hadoop cluster is accessed by thousands of users it is best to use SSO, hence AD/LDAP, for easy management of user credentials and corporate security policies. When you log on to a node in a non-secured Hadoop cluster, it basically gives you access to all the resources: if you log on as TOM, even if someone has stolen your credentials, the cluster will believe you are indeed TOM, and so will YARN and the other components, which in modern IT infrastructure is very dangerous with all the hacking, DoS attacks, etc. In a Kerberized environment Hadoop won't simply believe you are TOM; it will ask you for a ticket, the analogy of a passport at an airport, and to make sure the passport is not forged, as immigration does, it will check your ticket (passport) against its database to ascertain it was not stolen. ONLY after validating that you are really TOM will it allow you to run queries or jobs on that cluster. That's quite reassuring, isn't it? For documentation there should be some in this forum; if not, I will need to mask some data if I am to provide you my production integration documentation. Happy Hadooping
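To make the passport analogy concrete, this is roughly what the ticket dance looks like from a client node (tom@EXAMPLE.COM is a placeholder principal, not from the post):
# request a ticket-granting ticket; the KDC only issues it after the password checks out
kinit tom@EXAMPLE.COM
# inspect the ticket cache that YARN, HDFS, etc. will validate on every request
klist
# throw the cached tickets away when you are done
kdestroy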
10-16-2016
07:34 AM
@tauqeer khan Try this solution:
1. Create a .ldif file, add the following lines to it, then save and exit (note: you need to know the directory manager password):
dn: cn=global_policy,cn=DOMAINL,cn=EXAMPLE,dc=EXAMPLE,dc=COM
changetype: modify
replace: krbMinPwdLife
krbMinPwdLife: 0
2. Run:
ldapmodify -h localhost -x -W -D "cn=directory manager" -f /root/test/krb_test.ldif
3. Now reset the password through kadmin.local:
kadmin.local
Authenticating as principal admin/admin@EXAMPLE.COM with password.
kadmin.local: change_password -pw secret123 admin@EXAMPLE.COM
Password for "admin@EXAMPLE.COM" changed.
kadmin.local: q
4. Run this command to clear the ticket cache:
kdestroy
5. Run "kinit admin" to log in to the KDC using the new password:
[root@bddec1v1-0019 ~]# kinit admin
Password for admin@EXAMPLE.COM:
[root@bddec1v1-0019 ~]# klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: admin@EXAMPLE.COM
Valid starting  Expires  Service principal
.......
....
Or:
[root@bddec1v1-0019 ~]# kadmin
Authenticating as principal self/admin@DOMAIN.TLD with password.
Password for self/admin@DOMAIN.TLD:
kadmin: getprivs
current privileges: GET ADD MODIFY DELETE
kadmin: cpw someuser
Enter password for principal "someuser@DOMAIN.TLD":
Re-enter password for principal "someuser@DOMAIN.TLD":
Password for "someuser@DOMAIN.TLD" changed.
kadmin: quit
10-13-2016
07:13 PM
@Roberto Sancho What are the contents of your /etc/yum.repos.d/*? You should have ambari.repo, hdp.repo, etc. Where does the baseurl point to?
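As a quick way to check, something along these lines (the repo file names shown are the usual ones; adjust if yours differ):
# list the repo files and see where each baseurl points
ls /etc/yum.repos.d/
grep -H baseurl /etc/yum.repos.d/ambari.repo /etc/yum.repos.d/HDP*.repo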
10-13-2016
06:55 PM
@prabhavathi Muthusenapathy Welcome on board! The recently released HDF 2.0 (NiFi) is a game changer, and I think you should start from a good base. Deployment of HDF 2.0 is now provisioned by Ambari. Grab this document and proceed, and take full advantage of the integration with Ranger, SSL, etc. CAUTION: you should not install HDF 2.0 on an existing Ambari-managed HDP node or cluster. Here are 3 more sites that will walk you through the different setups ... why not use blueprints ... link1 link2 Link3 Hope that helps
10-08-2016
03:41 PM
@Muthukumar S The log below shows your Ambari server is listening on 8440. Can you try telnet 172.27.3.42 8440? And also make sure the Ambari database (Derby or MySQL) is running! 06Oct2016 13:12:23,611 WARN [qtp-ambari-agent-52] SecurityFilter:62 - This request is not allowed on this port: https://ip-172-27-3-42.ap-southeast-1.compute.internal:8440/ca Hope that helps
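A rough sketch of those checks, with the IP taken from your log (run telnet from the agent host and the rest on the Ambari server host):
# from the agent host: is the registration port reachable?
telnet 172.27.3.42 8440
# on the Ambari server host: is the server (and its database connection) up, and is 8440 bound?
ambari-server status
netstat -tnlp | grep 8440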
10-05-2016
09:31 AM
1 Kudo
@Yukti Agrawal I am not sure you can access https on 8080. If you have run the Ambari security setup then it should have defaulted to 8443, see the doc. (You wrote: "my existing cluster, when i try to access https://<ambari server hostname>:8080 then its only showing 'loading...'")
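For reference, a minimal sketch of how HTTPS is usually enabled for the Ambari UI (certificate paths are placeholders; 8443 is the customary default port):
# on the Ambari server host, walk through the security wizard and pick the option
# that enables HTTPS for the Ambari server, supplying your certificate and key
ambari-server setup-security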
09-29-2016
07:55 PM
@Ramy Mansour I found this interesting to read; it could help! Link
09-29-2016
07:18 PM
If you are using a local repo then don't point it to the public repo, and make sure your satellite server can serve at that URL:
baseurl=http://10.78.1.240/AMBARI-2.2.1.0/centos6/2.2.1.0-161/
gpgcheck=0
enabled=0
priority=1
Then run:
yum clean all
yum install ambari-server
That should work
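For reference, a minimal sketch of a complete /etc/yum.repos.d/ambari.repo for a local mirror (the repo id and name lines are assumptions; enabled=1 is used here because yum skips a repo with enabled=0 unless you pass --enablerepo; the baseurl is taken from the post):
[AMBARI-2.2.1.0]
name=Ambari 2.2.1.0 local mirror
baseurl=http://10.78.1.240/AMBARI-2.2.1.0/centos6/2.2.1.0-161/
gpgcheck=0
enabled=1
priority=1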