Member since
01-19-2017
3620
Posts
599
Kudos Received
360
Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 982 | 04-06-2023 12:49 PM |
| | 474 | 10-26-2022 12:35 PM |
| | 980 | 09-27-2022 12:49 PM |
| | 1118 | 05-27-2022 12:02 AM |
| | 967 | 05-26-2022 12:07 AM |
06-03-2019
05:16 AM
1 Kudo
@Rohit Sharma I suspect your Ambari server is overwhelmed. How long has your installation been in use? You should think about purging some old data; see the Hortonworks solution for purging Ambari history. Use the statement below to get the size of the Ambari database (replace 'mydb' with your Ambari database name):

SELECT table_schema "Database Name", SUM(data_length + index_length)/1024/1024 "Database Size (MB)"
FROM information_schema.TABLES WHERE table_schema = 'mydb';

You should be able to load your page after the above action. HTH
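To actually purge the old history, newer Ambari releases ship a db-purge-history command; the sketch below is a hedged example, assuming Ambari 2.5.2 or later and a cluster named "mycluster" (both placeholders you would adjust). It only computes the cutoff date and shows, in comments, the commands that would run on the server:

```shell
# Assumptions: Ambari >= 2.5.2 (which ships db-purge-history) and a cluster
# named "mycluster" -- both are placeholders, adjust to your environment.
CLUSTER_NAME="mycluster"
# Keep the last 6 months of history; purge everything older.
FROM_DATE="$(date -d '6 months ago' +%Y-%m-%d)"
echo "${FROM_DATE}"
# With the server stopped, the purge itself would be:
#   ambari-server stop
#   ambari-server db-purge-history --cluster-name ${CLUSTER_NAME} --from-date ${FROM_DATE}
#   ambari-server start
```

Always take a database backup before purging.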
05-30-2019
08:51 PM
@Koffi This is a permission issue, which can be resolved by changing the ownership using the hdfs user, who happens to be the HDFS superuser. The error:

Permission denied: user=t_hdhusr, access=READ_EXECUTE, inode="/user/T_HDHUSR":T_HDHUSR:hdfs:drwx------

shows that t_hdhusr and T_HDHUSR are not interpreted as the same user; HDFS user and path names are case-sensitive. As the root user:

# su - hdfs
$ hdfs dfs -chown -R t_hdhusr:hdfs /user/T_HDHUSR

Now running the below should succeed:

$ hdfs dfs -ls /user/T_HDHUSR

HTH
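The case mismatch is the crux here. A minimal sketch of the comparison the NameNode is effectively making, using the two names from the error above (like HDFS, the shell compares them case-sensitively):

```shell
# The inode owner and the requesting user, taken from the error message.
owner="T_HDHUSR"
requester="t_hdhusr"
# Case-sensitive comparison, as HDFS performs it:
if [ "$owner" = "$requester" ]; then
  echo "same user"
else
  echo "different users: '$requester' != '$owner'"
fi
# -> different users: 't_hdhusr' != 'T_HDHUSR'
```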
05-30-2019
10:00 AM
1 Kudo
@Hak Tiong Ong That error suggests your yum repo could be the problem. Can you run this independently on that host?

$ sudo yum install java-1.8.0-openjdk-devel

You can use this command to locate your JAVA_HOME:

$ java -XshowSettings:properties -version

Then use that JAVA_HOME when running your ambari-server setup!
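If you want to capture JAVA_HOME programmatically, the java.home line of that property dump can be parsed. A small sketch, run here against a captured sample line so it is self-contained (the path /usr/lib/jvm/java-1.8.0-openjdk is illustrative; your JDK location will differ):

```shell
# `java -XshowSettings:properties -version` prints its settings on stderr;
# the "java.home" line holds the value to feed into ambari-server setup.
sample='    java.home = /usr/lib/jvm/java-1.8.0-openjdk'
detected="$(printf '%s\n' "$sample" | awk -F' = ' '/java\.home/ {print $2}')"
echo "JAVA_HOME=${detected}"
# On a live host you would instead pipe the real output:
#   java -XshowSettings:properties -version 2>&1 | awk -F' = ' '/java\.home/ {print $2}'
```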
05-29-2019
04:29 PM
@Adrián Gil Ping me on LinkedIn; I could help you remotely.
05-29-2019
03:48 PM
@Farhana Khan Any updates on this issue?
05-28-2019
05:00 PM
@Adrián Gil Let's discard the ALIAS. What I suggest is to completely remove it and have your /etc/hosts look like this:

10.61.2.10 bigdatapruebas.es
::1 localhost ip6-localhost ip6-loopback
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

Stop the ambari-server and agent:

# ambari-server stop
# ambari-agent stop

Create a file host_names_changes.json with the hostname change. Contents of host_names_changes.json (here "bigdata" is the cluster name and "old_name_here" stands for your previous hostname):

{
  "bigdata" : {
    "old_name_here" : "bigdatapruebas.es"
  }
}

With the server still stopped, apply the rename:

# ambari-server update-host-names host_names_changes.json

Update the host name for the various components (note each component connects with its own database user):

# Hive
grant all privileges on hive.* to 'hive'@'bigdatapruebas.es' identified by 'hive_password';
grant all privileges on hive.* to 'hive'@'bigdatapruebas.es' with grant option;

# Oozie
grant all privileges on oozie.* to 'oozie'@'bigdatapruebas.es' identified by 'oozie_password';
grant all privileges on oozie.* to 'oozie'@'bigdatapruebas.es' with grant option;

# Ranger
grant all privileges on ranger.* to 'ranger'@'bigdatapruebas.es' identified by 'ranger_password';
grant all privileges on ranger.* to 'ranger'@'bigdatapruebas.es' with grant option;

# Ranger KMS
grant all privileges on rangerkms.* to 'rangerkms'@'bigdatapruebas.es' identified by 'rangerkms_password';
grant all privileges on rangerkms.* to 'rangerkms'@'bigdatapruebas.es' with grant option;

Change these 3 values in /etc/ambari-server/conf/ambari.properties to the new hostname:

server.jdbc.rca.url=
server.jdbc.url=
server.jdbc.hostname=

Edit ambari-agent.ini and, under [server], replace the hostname with the new Ambari hostname:

[server]
hostname=bigdatapruebas.es
url_port=8440
secured_url_port=8441
connect_retry_delay=10
max_reconnect_retry_delay=30

Start the ambari-agent:

# ambari-agent start

Start Ambari:

# ambari-server start

Open the web browser at http://bigdatapruebas.es:8080 and start all components. Please revert.
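The three ambari.properties edits can be scripted with sed. The sketch below runs against a scratch copy so the substitution is easy to verify; "old-ambari-host" is a hypothetical placeholder for your previous hostname, and on the real server you would point the sed command at /etc/ambari-server/conf/ambari.properties instead:

```shell
OLD_HOST="old-ambari-host"        # hypothetical previous hostname
NEW_HOST="bigdatapruebas.es"
# Scratch file standing in for /etc/ambari-server/conf/ambari.properties:
f="$(mktemp)"
cat > "$f" <<EOF
server.jdbc.rca.url=jdbc:mysql://${OLD_HOST}:3306/ambari
server.jdbc.url=jdbc:mysql://${OLD_HOST}:3306/ambari
server.jdbc.hostname=${OLD_HOST}
EOF
# Replace every occurrence of the old hostname (a .bak backup is kept):
sed -i.bak "s/${OLD_HOST}/${NEW_HOST}/g" "$f"
grep -c "${NEW_HOST}" "$f"    # all 3 lines should now carry the new hostname
```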
05-27-2019
08:59 PM
@Muhammad waqas The Kerberos ticket error is due to a lack of cross-realm authentication configuration. This is what you need to do to successfully copy files between the 2 realms: set up Kerberos cross-realm trust for distcp, following diligently the steps mentioned here.
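As a rough orientation of what the trust setup involves (a hedged sketch; REALM.A, REALM.B, and the KDC hostnames are placeholders, and the full procedure, including creating matching krbtgt/REALM.B@REALM.A principals with identical passwords on both KDCs, is in the guide mentioned above), the krb5.conf on the distcp client needs both realms plus a [capaths] section, roughly:

```ini
# Hypothetical realms -- substitute your source and destination realm names.
[realms]
  REALM.A = { kdc = kdc-a.example.com }
  REALM.B = { kdc = kdc-b.example.com }

[capaths]
  # Direct trust: REALM.A clients reach REALM.B with no intermediate realm.
  REALM.A = {
    REALM.B = .
  }
```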
05-27-2019
06:57 PM
@Adrián Gil You succeeded but are encountering heartbeat lost because your /etc/hosts entry is wrong 🙂 I really can't understand how you can connect 🙂 The entry below is wrong and shouldn't resolve; that somehow explains why you had difficulty in running the host_names_changes.json:

bigdata@bigdata:~$ sudo cat /etc/hosts
10.61.2.10 bigdatapruebas.es bigdata.adurizenergia.es bigdata.es bigdata

The /etc/hosts entry should be exactly the output of:

$ hostname -f
bigdatapruebas.es

So your /etc/hosts should look like below:

$ sudo cat /etc/hosts
10.61.2.10 bigdatapruebas.es

But if you had an FQDN like bigdata.endesa.es, the entry could be IP FQDN ALIAS:

$ sudo cat /etc/hosts
IP FQDN ALIAS
------------------------------------------------
10.61.2.10 bigdata.endesa.es bigdata

With the above entry in /etc/hosts, you can access Ambari successfully in 2 ways: http://bigdata.endesa.es:8080 and http://bigdata:8080. Please make the necessary changes and revert.
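The rule above can be sanity-checked with a few lines of shell. A minimal sketch, run here against the sample entry so it works anywhere; on the live host you would set fqdn="$(hostname -f)" and read the real /etc/hosts line:

```shell
# The FQDN reported by `hostname -f` must appear on the host's line in /etc/hosts.
hosts_line="10.61.2.10 bigdatapruebas.es"
fqdn="bigdatapruebas.es"          # on a live host: fqdn="$(hostname -f)"
case " ${hosts_line} " in
  *" ${fqdn} "*) result="OK: ${fqdn} found in hosts entry" ;;
  *)             result="MISMATCH: ${fqdn} missing from hosts entry" ;;
esac
echo "$result"
# -> OK: bigdatapruebas.es found in hosts entry
```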
05-27-2019
06:04 PM
@Shesh Kumar I am happy this compilation has helped give you a better understanding. If you found this answer addressed your question, please take a moment to log in and click the "accept" link on the answer. That would be a great help to Community users trying to find the solution quickly for these kinds of errors. Happy Hadooping!
05-27-2019
06:01 PM
@Shesh Kumar Sorry for that; I think that option is only available once you have attained a certain level of mastery (Guru, around 600 points), and I see you have 63 points. Post it as an idea; a moderator might then promote it to a KB or HCC reference and give you points for that. Unfortunately, I am not an HWX employee. HTH