Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2725 | 04-27-2020 03:48 AM |
| | 5285 | 04-26-2020 06:18 PM |
| | 4450 | 04-26-2020 06:05 PM |
| | 3576 | 04-13-2020 08:53 PM |
| | 5380 | 03-31-2020 02:10 AM |
08-22-2017
04:51 PM
@ROHIT AILA If you want to connect using the Phoenix JDBC URL ("jdbc:phoenix:..."), then you will have to load the driver class "org.apache.phoenix.jdbc.PhoenixDriver":

import org.apache.phoenix.jdbc.PhoenixDriver;

Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
Connection con = DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3:2181:/hbase-unsecure");
Statement stmt = null;
ResultSet rset = null;
08-22-2017
08:43 AM
@Dominik Ludwig This HCC thread looks like a duplicate of the following thread: https://community.hortonworks.com/questions/129949/communication-between-ambari-instances.html?childToView=129951#answer-129951 . Yes, once we can access the REST API of the remote cluster, we can extract the information that the Ambari Server itself extracts from its database, including the information about configurations/hosts/metrics (metrics data is fetched from the AMS data store). However, Ambari Server also fetches some information from the JMX URLs of individual components like the NameNode/DataNode; that information is not accessible via the Ambari APIs because it is received directly from the components.
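As a rough illustration of the JMX endpoints mentioned above, the sketch below parses the JSON that a Hadoop component's /jmx servlet returns. Note the bean fields and values here are abbreviated placeholders, not captured from a real cluster:

```python
import json

# Abbreviated sample of what a NameNode's
# http://<namenode-host>:50070/jmx?qry=Hadoop:service=NameNode,name=NameNodeInfo
# endpoint might return; the numeric values are placeholders.
sample_jmx = json.dumps({
    "beans": [
        {
            "name": "Hadoop:service=NameNode,name=NameNodeInfo",
            "Total": 1000000,
            "Free": 250000,
        }
    ]
})

def find_bean(jmx_json, bean_name):
    """Return the first JMX bean whose name matches bean_name, or None."""
    for bean in json.loads(jmx_json).get("beans", []):
        if bean.get("name") == bean_name:
            return bean
    return None

bean = find_bean(sample_jmx, "Hadoop:service=NameNode,name=NameNodeInfo")
print(bean["Free"])  # prints 250000
```

This is the kind of per-component data you would have to collect from each host's JMX URL yourself, since the Ambari API does not relay it.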
08-22-2017
08:36 AM
@Dominik Ludwig Yes. When we configure the "Remote Cluster", we will have to know two things about the remote cluster: 1. The cluster API URI: http://$REMOTE_AMBARI_HOSTNAME:8080/api/v1/clusters/$CLUSTER_NAME 2. The remote cluster's Ambari admin credentials (like admin/admin). Once we provide the above two details in our Remote Cluster configuration, we can extract almost every detail from the remote cluster, like the various configurations/resources/statistics etc. Ambari relies entirely on the REST APIs to communicate with the remote cluster. https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.1.0/bk_ambari-administration/content/registering_remote_clusters.html
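To make the two required details concrete, here is a minimal sketch that builds the request URL and basic-auth headers for querying a remote cluster's Ambari API. The hostname, cluster name, and admin/admin credentials are illustrative placeholders:

```python
import base64

def ambari_cluster_request(host, cluster_name, user, password, port=8080):
    """Build the URL and headers for a GET against a remote Ambari cluster API.

    host/cluster_name/user/password are the two pieces of information the
    Remote Cluster configuration asks for: the API URI and the admin credentials.
    """
    url = "http://%s:%d/api/v1/clusters/%s" % (host, port, cluster_name)
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    headers = {
        "Authorization": "Basic " + token,
        # Ambari rejects state-changing requests that lack this header.
        "X-Requested-By": "ambari",
    }
    return url, headers

url, headers = ambari_cluster_request(
    "remote-ambari.example.com", "MyCluster", "admin", "admin")
print(url)  # http://remote-ambari.example.com:8080/api/v1/clusters/MyCluster
```

Passing the resulting URL and headers to any HTTP client reproduces what the Ambari UI itself does when it talks to a registered remote cluster.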
08-22-2017
08:27 AM
@Johannes Mayer There can be two approaches: 1. Suppress the alerts (if they are getting cleared quickly) by increasing the "timeout" and "check interval" of those alerts. 2. Find the resources that might be causing the issue, for example by looking at the memory:
# free -m
# top
# Logs of individual components like the DataNode & NodeManager (when they were not responding): were there any long GC pauses? Looking at the GC logs of the DataNode/NodeManager can help determine whether there were long GC pauses.
# /var/log/messages, to see any abnormal behaviour during the time period.
# A SAR report (historical data capture of the OS, which includes various statistics including IO/CPU/Memory etc.).
08-22-2017
07:48 AM
1 Kudo
@Johannes Mayer The "Ambari Server Alerts" normally occur if the Ambari Agents are not reporting alert status or are not running (or, due to some load on the host, the agent is unable to respond to the Ambari Server). We can check the heartbeat messages of the agents in the "/var/log/ambari-agent/ambari-agent.log" file to see whether some heartbeats were lost when this issue happened. If it happens as soon as we trigger a job, that indicates the job might be utilizing lots of network bandwidth, sockets, or other system resources. The other two alerts are Web UI alerts from the NodeManager and DataNode; they might be due to long GC pauses or resource limitations on those hosts, such that while the job was running these processes could not respond to the Ambari Agent's request to access the Web UI (which may be due to a slow network call as well). Increasing the UI alert "Connection Timeout" from 5 to 15 or more seconds and the "Check Interval" to 2-3 minutes can help reduce the CRITICAL messages. In general this can happen broadly due to resource limitations (either network bandwidth/load, or the host being overloaded on memory/CPU).
08-22-2017
07:32 AM
@Dominik Ludwig Using the Ambari API is the best approach, instead of developing a view/tool to modify the Ambari DB. The Ambari APIs are very mature, and the Ambari UI itself makes those API calls to fetch details, perform operations, and monitor resources. So I think there might be many enhancements to the Ambari API in the future, but they won't affect current Ambari API usage or the fundamental behavior of API calls. So I will suggest you rely on the Ambari API instead of direct DB modification. There is a very good link available on Ambari API usage for managing & monitoring resources: https://github.com/apache/ambari/blob/trunk/ambari-server/docs/api/v1/index.md . Most of the details that we could extract from the database directly can be extracted using the Ambari APIs.
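To show what consuming the Ambari API looks like instead of querying the DB, the sketch below pulls cluster names out of a GET /api/v1/clusters response body. The sample JSON is an abbreviated placeholder shaped like a real response, not output from an actual server:

```python
import json

# Abbreviated, illustrative sample of a GET /api/v1/clusters response.
sample_response = json.dumps({
    "href": "http://ambari.example.com:8080/api/v1/clusters",
    "items": [
        {
            "href": "http://ambari.example.com:8080/api/v1/clusters/MyCluster",
            "Clusters": {"cluster_name": "MyCluster", "version": "HDP-2.6"},
        }
    ]
})

def cluster_names(body):
    """Extract the cluster names listed in an Ambari /api/v1/clusters response."""
    return [item["Clusters"]["cluster_name"]
            for item in json.loads(body)["items"]]

print(cluster_names(sample_response))  # ['MyCluster']
```

The same items/Clusters pattern applies to most Ambari collection endpoints (hosts, services, configurations), which is why the API covers what direct DB reads would otherwise be used for.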
08-22-2017
07:18 AM
@Dominik Ludwig You can also refer to the ER diagram of Ambari 2.4, as shown in the following link: https://community.hortonworks.com/storage/attachments/13698-ambari-database-er-diagram.pdf
08-22-2017
07:11 AM
1 Kudo
@Dominik Ludwig There is no official doc on the usage of each and every Ambari database table. But you can refer to the Ambari DB schema and its relations between tables by looking at the SQL files that Ambari uses to set up the Ambari DB:

# ls -lart /var/lib/ambari-server/resources/*.sql
-rwxr-xr-x. 1 root root 81282 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-SQLServer-CREATE.sql
-rwxr-xr-x. 1 root root 827 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Postgres-EMBEDDED-DROP.sql
-rwxr-xr-x. 1 root root 1232 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Postgres-EMBEDDED-CREATE.sql
-rwxr-xr-x. 1 root root 1337 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Postgres-DROP.sql
-rwxr-xr-x. 1 root root 79172 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Postgres-CREATE.sql
-rwxr-xr-x. 1 root root 2160 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Oracle-DROP.sql
-rwxr-xr-x. 1 root root 85484 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-Oracle-CREATE.sql
-rwxr-xr-x. 1 root root 1192 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-MySQL-DROP.sql
-rwxr-xr-x. 1 root root 80064 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-MySQL-CREATE.sql
-rwxr-xr-x. 1 root root 96945 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-AzureDB-CREATE.sql
-rwxr-xr-x. 1 root root 2117 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-SQLServer-DROP.sql
-rwxr-xr-x. 1 root root 4215 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-SQLServer-CREATELOCAL.sql
-rwxr-xr-x. 1 root root 0 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-SQLAnywhere-DROP.sql
-rwxr-xr-x. 1 root root 84134 May 27 02:11 /var/lib/ambari-server/resources/Ambari-DDL-SQLAnywhere-CREATE.sql

There are some third-party tools available that can produce a graph from a SQL schema file. But updating the DB entries directly is not a recommended approach (except in extreme scenarios when we have no other options left). So you should rely on the tools offered by Ambari, like the "configs.sh" script for managing configurations, and the REST APIs for managing and monitoring Ambari resources.
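If you only need a quick inventory of the schema rather than a full ER diagram, one lightweight approach is to scan a DDL file for its CREATE TABLE statements. The sample DDL below is an illustrative stand-in, not copied from the real Ambari-DDL-*-CREATE.sql files:

```python
import re

# Illustrative excerpt standing in for an Ambari-DDL-*-CREATE.sql file.
sample_ddl = """
CREATE TABLE clusters (cluster_id BIGINT NOT NULL, cluster_name VARCHAR(100));
CREATE TABLE hosts (host_id BIGINT NOT NULL, host_name VARCHAR(255));
CREATE TABLE clusterconfig (config_id BIGINT NOT NULL);
"""

def table_names(ddl_text):
    """List the table names declared via CREATE TABLE in a DDL script."""
    return re.findall(r"CREATE TABLE\s+(\w+)", ddl_text, flags=re.IGNORECASE)

print(table_names(sample_ddl))  # ['clusters', 'hosts', 'clusterconfig']
```

Running the same scan over /var/lib/ambari-server/resources/Ambari-DDL-Postgres-CREATE.sql (or the file matching your DB flavor) gives you the table list to explore, without touching the live database.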
08-22-2017
06:53 AM
@Dominik Ludwig Yes, Ambari allows creating and registering our own custom Ambari Views. We can drop our custom View JAR/WAR inside the following directory: /var/lib/ambari-server/resources/views/ There are also many examples available within the Ambari code base that explain how it can be done: https://github.com/apache/ambari/tree/trunk/ambari-views/examples
08-22-2017
04:04 AM
@Amithesh Merugu If you want to copy a file "F:/dev/jars sparktest.jar" from your local file system to the HDFS location "/user/maria_dev", then you will need the following: 1. Switch to the "hdfs" user (or else you will have to log in to the shell as "maria_dev", or as any user who has write access inside the "/user/maria_dev" directory, in order to write inside that user directory): # su - hdfs 2. Now you can use the "put" switch with the dfs command to place the file, as follows: # hdfs dfs -put F:/dev/jars sparktest.jar /user/maria_dev 3. Now you should be able to list the file: # hdfs dfs -ls /user/maria_dev
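If the upload needs to be scripted rather than typed interactively, the steps above can be wrapped around the same `hdfs dfs -put` command. The `put_command` helper name and the paths below are illustrative, and actually executing the command requires the `hdfs` CLI on a cluster node:

```python
import subprocess

def put_command(local_path, hdfs_dir):
    """Build the argv for copying a local file into HDFS with `hdfs dfs -put`."""
    return ["hdfs", "dfs", "-put", local_path, hdfs_dir]

cmd = put_command("sparktest.jar", "/user/maria_dev")
print(" ".join(cmd))  # hdfs dfs -put sparktest.jar /user/maria_dev

# On a cluster node (as a user with write access to /user/maria_dev):
# subprocess.run(cmd, check=True)
```

Building the argv as a list (rather than one shell string) keeps paths with spaces intact, which matters for file names like the one in the question.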