
Solr installation

Solved

Re: Solr installation

Guru
@Ravi

Hey Ravi, thanks. I solved it by changing the value of SOLR_HEAP to 1024 MB in /opt/lucidworks-hdpsearch/solr/bin/solr.in.sh. Thanks once again for all your help.

SOLR_HEAP="1024m"

[solr@m1 solr]$ /opt/lucidworks-hdpsearch/solr/bin/solr create -c test -d /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs_hdfs/conf -n test -s 2 -rf 2

Connecting to ZooKeeper at m1.hdp22:2181,m2.hdp22:2181,w1.hdp22:2181

Uploading /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs_hdfs/conf for config test to ZooKeeper at m1.hdp22:2181,m2.hdp22:2181,w1.hdp22:2181

Creating new collection 'test' using command:

http://192.168.56.42:8983/solr/admin/collections?action=CREATE&name=test&numShards=2&replicationFact...

{
  "responseHeader":{
    "status":0,
    "QTime":8494},
  "success":{"":{
      "responseHeader":{
        "status":0,
        "QTime":8338},
      "core":"test_shard1_replica1"}}}

Re: Solr installation

Rising Star

@Saurabh Kumar

You are welcome.

The Java heap space issue is caused by the heap size of the Solr process: by default, Solr starts with only 512 MB. You can increase this by editing the Solr config files, or via the solr command-line options, for example:

/opt/lucidworks-hdpsearch/solr/bin/solr -m 2g create -c test -d /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs_hdfs/conf -n test -s 2 -rf 2

This will resolve the Java heap space issue.
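To see why 512 MB falls short while 1024m or 2g works, it helps to compare the JVM heap specs numerically. The helper below is purely illustrative (it is not part of Solr or its scripts); it converts strings like those passed to -m or SOLR_HEAP into bytes:

```python
# Hypothetical helper (not part of Solr): convert JVM heap strings
# such as "512m" or "2g" into bytes for easy comparison.
def heap_bytes(spec):
    units = {'k': 1024, 'm': 1024 ** 2, 'g': 1024 ** 3}
    return int(spec[:-1]) * units[spec[-1].lower()]

# The -m 2g flag gives four times the 512 MB default,
# and "1024m" is the same as "1g".
assert heap_bytes("2g") == 4 * heap_bytes("512m")
assert heap_bytes("1024m") == heap_bytes("1g")
```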


Re: Solr installation

New Contributor

Hi Saurabh,

I know this is an older question, but if you (or anyone else) are still looking to monitor Solr Cloud via Ambari, laying a custom service on top of your existing installation might be useful. The following steps integrate Solr Cloud into Ambari, complete with alerts and the ability to start, stop, and monitor status.

This setup assumes an existing, standard Solr Cloud installation, with the Solr Cloud UI available on port 8983.


On the Ambari node, create /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR/package/scripts.

In /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR, create alerts.json and metainfo.xml, as follows (you can, of course, change the version to whatever version of Solr you have installed):

alerts.json

{
  "SOLR": {
    "service": [],
    "SOLR_CLOUD": [
      {
        "name" : "solr_cloud_ui",
        "label" : "Solr Cloud UI",
        "description" : "This host-level alert is triggered if the Solr Cloud Web UI is unreachable.",
        "interval" : 1,
        "scope" : "ANY",
        "source" : {
          "type" : "WEB",
          "uri" : {
            "http" : "http://0.0.0.0:8983",
            "connection_timeout" : 5.0
          },
          "reporting" : {
            "ok" : {
              "text" : "HTTP {0} response in {2:.3f}s"
            },
            "warning" : {
              "text" : "HTTP {0} response from {1} in {2:.3f}s ({3})"
            },
            "critical" : {
              "text" : "Connection failed to {1} ({3})"
            }
          }
        }
      }
    ]
  }
}
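A typo in alerts.json can keep the alert from registering when Ambari restarts, so it's worth a quick sanity check that the file parses and each alert definition carries the fields shown above. A minimal sketch (the field list reflects this example, not an exhaustive Ambari schema):

```python
import json

# Fields used by the alert definition in this example.
REQUIRED = {"name", "label", "description", "interval", "scope", "source"}

def missing_alert_fields(text, service="SOLR", component="SOLR_CLOUD"):
    """Return (alert name, missing keys) pairs for incomplete definitions."""
    defs = json.loads(text)[service][component]
    return [(a.get("name"), sorted(REQUIRED - a.keys()))
            for a in defs if REQUIRED - a.keys()]
```

Run it against the file before restarting Ambari, e.g. `missing_alert_fields(open("alerts.json").read())`; an empty list means every alert definition is complete.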

metainfo.xml

<?xml version="1.0"?>
<metainfo>
  <schemaVersion>2.0</schemaVersion>
  <services>
    <service>
      <name>SOLR</name>
      <displayName>Solr</displayName>
      <comment>Solr is an open source enterprise search platform, written in Java, from the Apache Lucene project.</comment>
      <version>5.2.1</version>
      <components>
        <component>
          <name>SOLR_CLOUD</name>
          <displayName>Solr Cloud Server</displayName>
          <category>MASTER</category>
          <cardinality>1+</cardinality>
          <commandScript>
            <script>scripts/solrcloud.py</script>
            <scriptType>PYTHON</scriptType>
            <timeout>600</timeout>
          </commandScript>
        </component>
      </components>
    </service>
  </services>
</metainfo>
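The same caution applies to metainfo.xml: malformed XML or a mistyped script path will surface only after the Ambari restart. A small sketch (illustrative, standard-library only) that confirms the file is well-formed and shows which command scripts it references:

```python
import xml.etree.ElementTree as ET

def metainfo_summary(text):
    """Parse a metainfo.xml and return (service name, command script paths)."""
    root = ET.fromstring(text)  # raises ParseError if the XML is malformed
    name = root.find("./services/service/name").text
    scripts = [s.text for s in root.iter("script")]
    return name, scripts
```

For the file above, `metainfo_summary(open("metainfo.xml").read())` should report the SOLR service and `scripts/solrcloud.py`; verify that path exists under package/scripts.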

In /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR/package/scripts, create params.py and solrcloud.py, as follows:

params.py

cloud_stop = ('/sbin/service', 'solr', 'stop')
cloud_start = ('/sbin/service', 'solr', 'start')
cloud_pid_file = '/opt/lucidworks-hdpsearch/solr/bin/solr-8983.pid'

solrcloud.py

from resource_management import *
from resource_management.core.resources.system import Execute
from resource_management.libraries.functions import check_process_status


class Master(Script):
  def install(self, env):
    print 'Installing Solr Cloud'

  def stop(self, env):
    import params
    env.set_params(params)
    Execute(params.cloud_stop, sudo=True)

  def start(self, env):
    import params
    env.set_params(params)
    Execute(params.cloud_start, sudo=True)

  def status(self, env):
    import params
    env.set_params(params)
    check_process_status(params.cloud_pid_file)

  def configure(self, env):
    print 'Configuring Solr Cloud'


if __name__ == "__main__":
  Master().execute()

At this point, after restarting Ambari, you will be able to "install" Solr Cloud via the Ambari Add Service wizard, specifying a Solr Cloud Server on whichever hosts Solr is already installed. As you might note from solrcloud.py, the installation doesn't do anything other than configure Ambari to be aware that the components exist on the hosts.

Once the installation is complete, Solr will be listed as an Ambari Service, with each Solr Cloud server listed as an individual Master component.

Hope this helps.

Joe