Created on 12-24-2016 11:33 AM
PROBLEM: The Ambari service check for Solr fails when the active NameNode is nn2. The stderr output from the failed check is shown below.
ERROR:
stderr: /var/lib/ambari-agent/data/errors-803.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/service_check.py", line 48, in <module>
    ServiceCheck().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SOLR/5.5.2.2.5/package/scripts/service_check.py", line 43, in service_check
    user=params.solr_config_user
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/opt/lucidworks-hdpsearch/solr/bin/solr create_collection -c collection1 -d data_driven_schema_configs -p 8983 -s 2 -rf 1 >> /var/log/service_solr/solr-service.log 2>&1' returned 1.
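Before looking at the Solr log, it can help to confirm which NameNode is currently active. A minimal check, assuming the HA NameNode service IDs are nn1 and nn2 as implied by the problem statement:

# Hypothetical check; nn1/nn2 are the HA NameNode service IDs assumed from the problem statement.
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2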
Below is the corresponding error message from the Solr log:
2016-09-15 17:04:49,886 [qtp1192108080-19] ERROR [ ] org.apache.solr.update.SolrIndexWriter (SolrIndexWriter.java:135) - Error closing IndexWriter
java.net.ConnectException: Call From dummyhost/0.0.0.0 to dummyhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy10.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:554)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy11.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1969)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1952)
    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:693)
    at org.apache.hadoop.hdfs.DistributedFileSystem.access$600(DistributedFileSystem.java:105)
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:755)
    at org.apache.hadoop.hdfs.DistributedFileSystem$15.doCall(DistributedFileSystem.java:751)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:751)
    at org.apache.solr.store.hdfs.HdfsDirectory.listAll(HdfsDirectory.java:168)
    at org.apache.lucene.store.FilterDirectory.listAll(FilterDirectory.java:57)
    at org.apache.lucene.store.NRTCachingDirectory.listAll(NRTCachingDirectory.java:101)
    at org.apache.lucene.store.FilterDirectory.listAll(FilterDirectory.java:57)
    at org.apache.lucene.index.IndexFileDeleter.refresh(IndexFileDeleter.java:426)
    at org.apache.lucene.index.IndexWriter.rollbackInternalNoCommit(IndexWriter.java:2099)
    at org.apache.lucene.index.IndexWriter.rollbackInternal(IndexWriter.java:2041)
    at org.apache.lucene.index.IndexWriter.shutdown(IndexWriter.java:1083)
    at org.apache.lucene.index.IndexWriter.close(IndexWriter.java:1125)
    at org.apache.solr.update.SolrIndexWriter.close(SolrIndexWriter.java:130)
    at org.apache.solr.update.DirectUpdateHandler2.closeWriter(DirectUpdateHandler2.java:832)
    at org.apache.solr.update.DefaultSolrCoreState.closeIndexWriter(DefaultSolrCoreState.java:85)
    at org.apache.solr.update.DefaultSolrCoreState.close(DefaultSolrCoreState.java:358)
    at org.apache.solr.update.SolrCoreState.decrefSolrCoreState(SolrCoreState.java:73)
    at org.apache.solr.core.SolrCore.close(SolrCore.java:1225)
    at org.apache.solr.core.SolrCore.closeAndWait(SolrCore.java:1015)
    at org.apache.solr.core.CoreContainer.unload(CoreContainer.java:994)
    at org.apache.solr.handler.admin.CoreAdminOperation$2.call(CoreAdminOperation.java:144)
    at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:354)
    at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:153)
ROOT CAUSE: This is a product defect, tracked as BUG-68180. Because the Solr environment is not pointed at the Hadoop client configuration, Solr does not resolve the HA nameservice and instead calls a single NameNode address on port 8020; when that node is not the active NameNode (here, nn2 is active), the call is refused, as seen in the Solr log above.
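As a quick sanity check (a hypothetical example, assuming the standard HDP client configuration directory), you can confirm that the HA nameservice is defined in the Hadoop client configuration that Solr should be reading:

# Hypothetical check, assuming /etc/hadoop/conf holds the HDFS client config.
# If dfs.nameservices and dfs.ha.namenodes.* are defined here but Solr is not
# started with SOLR_HDFS_CONFIG pointing at this directory, Solr cannot
# resolve the nameservice and falls back to a single NameNode address.
grep -A 1 'dfs.nameservices' /etc/hadoop/conf/hdfs-site.xml
grep -A 1 'dfs.ha.namenodes' /etc/hadoop/conf/hdfs-site.xml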
RESOLUTION:
Add the line below to the solr-config-env content in Ambari, placing it below JAVA_HOME. This resolved the issue.
export SOLR_HDFS_CONFIG=/etc/hadoop/conf
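After saving the change and restarting Solr so the new environment takes effect, re-run the Solr service check from Ambari, or manually re-run the same command the check executes (copied from the stderr above; a sketch, assuming Solr is listening on port 8983 on the same host):

# Same collection-creation command the Ambari service check runs (taken from
# the stderr above). It should now succeed because Solr picks up the HA
# nameservice from /etc/hadoop/conf via SOLR_HDFS_CONFIG.
/opt/lucidworks-hdpsearch/solr/bin/solr create_collection -c collection1 \
    -d data_driven_schema_configs -p 8983 -s 2 -rf 1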