Failed to start Nutch with HBase

Expert Contributor

InjectorJob: org.apache.gora.util.GoraException: java.lang.RuntimeException: java.lang.RuntimeException: Current heap configuration for MemStore and BlockCache exceeds the threshold required for successful cluster operation. The combined value cannot exceed 0.8. Please check the settings for hbase.regionserver.global.memstore.upperLimit and hfile.block.cache.size in your configuration.
    at org.apache.gora.store.DataStoreFactory.createDataStore(DataStoreFactory.java:167)
    at org.apache.gora.store.DataStoreFactory.createDataStore(DataStoreFactory.java:135)
    at org.apache.nutch.storage.StorageUtils.createWebStore(StorageUtils.java:75)
    at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:221)
    at org.apache.nutch.crawl.InjectorJob.inject(InjectorJob.java:251)
    at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:273)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.nutch.crawl.InjectorJob.main(InjectorJob.java:282)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Current heap configuration for MemStore and BlockCache exceeds the threshold required for successful cluster operation. The combined value cannot exceed 0.8. Please check the settings for hbase.regionserver.global.memstore.upperLimit and hfile.block.cache.size in your configuration.
    at org.apache.gora.hbase.store.HBaseStore.initialize(HBaseStore.java:127)
    at org.apache.gora.store.DataStoreFactory.initializeDataStore(DataStoreFactory.java:102)
    at org.apache.gora.store.DataStoreFactory.createDataStore(DataStoreFactory.java:161)
    ... 7 more
Caused by: java.lang.RuntimeException: Current heap configuration for MemStore and BlockCache exceeds the threshold required for successful cluster operation. The combined value cannot exceed 0.8. Please check the settings for hbase.regionserver.global.memstore.upperLimit and hfile.block.cache.size in your configuration.
    at org.apache.hadoop.hbase.HBaseConfiguration.checkForClusterFreeMemoryLimit(HBaseConfiguration.java:77)
    at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:90)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:100)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:110)
    at org.apache.gora.hbase.store.HBaseStore.initialize(HBaseStore.java:108)
    ... 9 more
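For context, the check behind this error compares the sum of two hbase-site.xml settings against 0.8. A minimal sketch of a combination that stays under that limit (the 0.4 and 0.3 values are only illustrative, not taken from the configuration that produced the error):

<!-- Sketch: the two settings the free-memory check reads; their sum must not exceed 0.8 -->
<property>
  <name>hbase.regionserver.global.memstore.upperLimit</name>
  <value>0.4</value> <!-- illustrative value -->
</property>
<property>
  <name>hfile.block.cache.size</name>
  <value>0.3</value> <!-- illustrative value; 0.4 + 0.3 = 0.7, which is below 0.8 -->
</property>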

9 REPLIES


Expert Contributor

The value for hbase.regionserver.global.memstore.size is 0.4, so it does not exceed 0.8:

<property>
  <name>hbase.regionserver.global.memstore.size</name>
  <value>0.4</value>
</property>

Super Guru

What about the value for hfile.block.cache.size?

Expert Contributor

The value for hfile.block.cache.size is 0.3, so 0.3 + 0.4 = 0.7 < 0.8; the combined value does not exceed 0.8.

Master Collaborator

Please check the value of

hbase.regionserver.global.memstore.upperLimit

Expert Contributor

I don't have hbase.regionserver.global.memstore.upperLimit; I have only hbase.regionserver.global.memstore.size.
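A sketch of one way to cover both property names in hbase-site.xml, assuming an older client-side jar still reads the deprecated hbase.regionserver.global.memstore.upperLimit name while the newer name is hbase.regionserver.global.memstore.size:

<property>
  <name>hbase.regionserver.global.memstore.size</name>
  <value>0.4</value>
</property>
<property>
  <!-- deprecated pre-1.0 name; set explicitly only if an older client jar checks it -->
  <name>hbase.regionserver.global.memstore.upperLimit</name>
  <value>0.4</value>
</property>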

Master Collaborator

Which version of HBase is used by Nutch?

Thanks

Expert Contributor

hbase 2.3.0.0

Master Collaborator

For HDP 2.3 (Apache 1.1.2),

./hbase-common/src/main/java/org/apache/hadoop/hbase/HBaseConfiguration.java calls

HeapMemorySizeUtil.checkForClusterFreeMemoryLimit(conf);

There is no HBaseConfiguration.checkForClusterFreeMemoryLimit.

Can you double-check your classpath to see which HBase-related jars are present?

Please pastebin the list of those jars.

Thanks