HBase read latency metric shows 200,000,000 ms. How to reduce this latency?

Expert Contributor

One of my clusters shows an HBase read latency of 200,000,000 ms, which is impacting cluster performance and hence certain jobs. I need to resolve this as early as possible.

(attached screenshot of the HBase metrics dashboard: 3131-hbase-metricspic.jpg)


6 REPLIES

Rising Star

Hi, could you provide more details, such as the slow RegionServer's log and the HMaster's log? By the way, are there any regions in transition (RIT) on the HMaster's UI page?
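
A minimal way to pull those details together, assuming a typical HDP-style layout where HBase logs live under /var/log/hbase (paths and file names vary by distribution):

    # Slow-RPC warnings on the suspect RegionServer
    grep "responseTooSlow" /var/log/hbase/hbase-*-regionserver-*.log | tail -20

    # Region assignment / transition activity in the HMaster log
    grep -i "transition" /var/log/hbase/hbase-*-master-*.log | tail -20

    # Report inconsistencies, including regions stuck in transition
    hbase hbck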

Super Guru

That's a latency of about 55 hours, or 2.3 days, for a single read. It seems extremely unlikely that HBase would even be functioning if that were the case, since the default RPC timeout for HBase is on the order of a few minutes. It seems like that graph may be inaccurate.
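
For reference, the arithmetic behind that figure, and the timeout it would have to exceed (hbase.rpc.timeout defaults to 60,000 ms):

    200,000,000 ms = 200,000 s
    200,000 s / 3,600 s per hour ≈ 55.6 hours ≈ 2.3 days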

Expert Contributor

I don't see any RIT on the HBase web UI, but I do see read request counts of 7M and 3M on a few DataNodes. Attaching logs. Is it advisable to restart ambari-server in this case?

Expert Contributor

Master Guru

Hi @Anshul Sisodia, can you try restarting Ambari Metrics, and if you recently made any Ambari config changes, also restart HBase? I also see that you have many minor GC collections; they are fast but happen too often. On the other hand, you have allocated only 4 GB for your RegionServers, and have about 50 GB free on both RegionServer nodes. Can you ramp them up to 16 GB, or at least 8 GB, and increase the new generation to at least 1 GB? Either way, that latency figure looks wrong.
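
If these settings are not being managed through the Ambari UI, the heap change can be sketched in hbase-env.sh roughly as follows; the exact values, and whether Ambari overrides them, depend on the cluster, so treat this as an illustration rather than the exact change applied here:

    # hbase-env.sh (illustrative values; Ambari normally manages these)
    # 16 GB RegionServer heap with a 1 GB new generation
    export HBASE_REGIONSERVER_OPTS="$HBASE_REGIONSERVER_OPTS -Xms16g -Xmx16g -Xmn1g"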

Expert Contributor (ACCEPTED SOLUTION)

Thanks for the input, guys. I increased the RegionServer heap to 16 GB and increased the handler count to 200. We also performed a manual major compaction on a few HBase tables heavily used by the customer. After that, HBase read latency dropped to acceptable limits.
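
For reference, the handler-count change and the manual major compaction look roughly like the following. The property name and shell command are standard HBase; the table name is only a placeholder:

    <!-- hbase-site.xml: RPC handler threads per RegionServer -->
    <property>
      <name>hbase.regionserver.handler.count</name>
      <value>200</value>
    </property>

    # From the HBase shell, trigger a major compaction on a heavily used table
    # ('customer_table' is a placeholder)
    major_compact 'customer_table'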