Created 07-10-2017 06:47 AM
I have my Java service and Solr (single-node SolrCloud) with ZooKeeper running on the same machine. The Java service exposes a REST API for fetching data from Solr and showing it on the UI.
Now, when the load on that machine increases, for example 100 users hitting the API simultaneously, the machine hangs and Solr crashes, but it restarts after a few minutes with the entire data lost. Why is the data getting lost? Solr crashing is acceptable, but the data should still be there.
Any help will be appreciated.
ZooKeeper logs:
[2017-06-12 13:11:20,055] WARN caught end of stream exception (org.apache.zookeeper.server.NIOServerCnxn)
EndOfStreamException: Unable to read additional data from client sessionid 0x15c78310667004e, likely client has closed socket
    at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
    at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
    at java.lang.Thread.run(Thread.java:748)
[2017-06-12 14:40:16,978] WARN caught end of stream exception (org.apache.zookeeper.server.NIOServerCnxn)
EndOfStreamException: Unable to read additional data from client sessionid 0x15c78310667006e, likely client has closed socket
    at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
    at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
    at java.lang.Thread.run(Thread.java:748)
Solr logs:
420206 [searcherExecutor-7-thread-1] INFO org.apache.solr.core.SolrCore - [rwindex_shard1_replica1] Registered new searcher Searcher@492f5449[rwindex_shard1_replica1] main{StandardDirectoryReader(segments_2:9:nrt)}
420207 [qtp1725097945-17] INFO org.apache.solr.update.processor.LogUpdateProcessor - [rwindex_shard1_replica1] webapp=/solr path=/update params={stream.body=<delete><query>*:*</query></delete>&commit=true} {deleteByQuery=*:*(-1572256668905897984),commit=} 0 323
420224 [qtp1725097945-17] ERROR org.apache.solr.servlet.SolrDispatchFilter - null:org.eclipse.jetty.io.EofException
    at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:914)
    at org.eclipse.jetty.http.AbstractGenerator.flush(AbstractGenerator.java:443)
    at org.eclipse.jetty.server.HttpOutput.flush(HttpOutput.java:100)
    at org.eclipse.jetty.server.AbstractHttpConnection$Output.flush(AbstractHttpConnection.java:1094)
    at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:297)
    at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:141)
    at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:229)
    at org.apache.solr.util.FastWriter.flush(FastWriter.java:137)
    at org.apache.solr.servlet.SolrDispatchFilter.writeResponse(SolrDispatchFilter.java:766)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:426)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:207)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.eclipse.jetty.server.Server.handle(Server.java:368)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
    at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
    at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:942)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1004)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:640)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
    at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketException: Broken pipe (Write failed)
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
    at org.eclipse.jetty.io.ByteArrayBuffer.writeTo(ByteArrayBuffer.java:375)
    at org.eclipse.jetty.io.bio.StreamEndPoint.flush(StreamEndPoint.java:164)
    at org.eclipse.jetty.io.bio.StreamEndPoint.flush(StreamEndPoint.java:194)
    at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:838)
    ... 35 more
420226 [qtp1725097945-17] ERROR org.apache.solr.servlet.SolrDispatchFilter - null:org.eclipse.jetty.io.EofException
    at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:914)
    at org.eclipse.jetty.http.AbstractGenerator.flush(AbstractGenerator.java:443)
    at org.eclipse.jetty.server.HttpOutput.flush(HttpOutput.java:100)
    at org.eclipse.jetty.server.AbstractHttpConnection$Output.flush(AbstractHttpConnection.java:1094)
    at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:297)
Created 07-10-2017 02:38 PM
Likely the data you have indexed hasn't been flushed, so when Solr crashes the data is lost. You should consider changing the flush settings to ensure the data is written to the index at a shorter interval.
https://cwiki.apache.org/confluence/display/solr/IndexConfig+in+SolrConfig
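As a sketch, the flush behavior described in that link lives in the <indexConfig> section of solrconfig.xml. The values below are illustrative only, not recommendations for your setup:

```xml
<!-- solrconfig.xml: example flush settings (values are illustrative) -->
<indexConfig>
  <!-- Flush buffered documents to disk once the in-memory buffer reaches this size -->
  <ramBufferSizeMB>32</ramBufferSizeMB>
  <!-- ...or once this many documents are buffered, whichever limit is hit first -->
  <maxBufferedDocs>10000</maxBufferedDocs>
</indexConfig>
```

Note that a flush alone does not make the data durable across a crash; only a hard commit does that.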
Created 07-10-2017 03:19 PM
Is there also a time-based flush setting, something like flush.hours? And if I haven't set either of the two properties below:
<ramBufferSizeMB>100</ramBufferSizeMB> <maxBufferedDocs>1000</maxBufferedDocs>
will the data always stay in memory until its size reaches 100 MB?
In my case, after indexing the data I only make select calls, no update calls.
Created 07-10-2017 03:24 PM
You can manually call a commit after indexing data using something like http://localhost:8983/solr/collection_name/update?commit=true.
Here is a link to information on autoCommit: https://cwiki.apache.org/confluence/display/solr/UpdateHandlers+in+SolrConfig
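For reference, the autoCommit described at that link is configured in the <updateHandler> section of solrconfig.xml. This is a hard commit, which is what makes buffered updates durable across a crash; the interval and limit below are examples, assuming durability matters more to you than commit overhead:

```xml
<!-- solrconfig.xml: example hard autoCommit (values are illustrative) -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <!-- Hard-commit pending updates at most 15 s after the first uncommitted update... -->
    <maxTime>15000</maxTime>
    <!-- ...or after this many uncommitted documents, whichever comes first -->
    <maxDocs>10000</maxDocs>
    <!-- Commit for durability only; don't reopen searchers on every commit -->
    <openSearcher>false</openSearcher>
  </autoCommit>
</updateHandler>
```

Since you say you only run select calls after the initial indexing, a single explicit commit at the end of indexing (as above) would also be enough; autoCommit is a safety net in case a commit is missed.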