New Contributor
Posts: 1
Registered: 10-19-2016

SolrCloud Out of Memory

Hi

We're using SolrCloud 4.10.3 on a 3-node Solr cluster with 2 collections of 3 shards each.

Collection 1: approx. 15.3 GB; Collection 2: 1.2 GB

 

Our heap size is 8 GB and off-heap is 15 GB. We have a real-time feed into Solr for one of our collections (the other is fairly static). We are constantly getting out-of-memory errors.
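For reference, as far as we can tell those settings correspond to JVM options along the lines of the following (the MaxDirectMemorySize flag is our assumption about how the off-heap value gets applied, since Cloudera Manager sets it for us):

-Xms8g -Xmx8g -XX:MaxDirectMemorySize=15g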

 

Can anyone help us work out the reason? Should we be adding more shards to spread the load? Or do we need to keep giving Solr more off-heap memory? All the Cloudera heap graphs show that we are fine for heap space (we rarely go above 6.5 GB) and GC pauses are not an issue.

 

Thanks

Cloudera Employee
Posts: 266
Registered: 01-09-2014

Re: SolrCloud Out of Memory

If you are getting an OOM error, it is most likely caused by bad queries, such as faceting on high-cardinality fields. If you suspect such queries are causing the OOM, you can turn on DEBUG logging of queries (which prints them in the logs before they execute) and then check whether a corresponding INFO entry appears after the query completes, or whether the OOM happens before that.

To enable that logging, add the following to the logging safety valve for SOLR:
log4j.logger.org.apache.solr.core.SolrCore.Request=DEBUG
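
As an illustration of the kind of request to watch for in those DEBUG entries, an unbounded facet over a high-cardinality field looks roughly like this (host, port, collection, and field name here are just placeholders, not taken from your setup):

http://hostname:8983/solr/collection1/select?q=*:*&facet=true&facet.field=user_id&facet.limit=-1

A request like that has to build facet counts for every distinct term in memory, which fits the pattern of an OOM that happens between the DEBUG entry and a missing INFO entry.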

-PD