Member since: 05-29-2017
Posts: 408
Kudos Received: 123
Solutions: 9
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2787 | 09-01-2017 06:26 AM
 | 1699 | 05-04-2017 07:09 AM
 | 1460 | 09-12-2016 05:58 PM
 | 2069 | 07-22-2016 05:22 AM
 | 1626 | 07-21-2016 07:50 AM
07-22-2016
05:22 AM
I solved it by applying the following steps:
- Stop all Solr instances
- Stop all ZooKeeper instances
- Start all ZooKeeper instances
- Start the Solr instances one at a time
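The restart sequence above can be sketched as a dry run. The host names below are hypothetical placeholders; in a real cluster you would replace the echo inside step with your actual stop/start command (for example ssh "$h" sudo service solr stop):

```shell
# Dry-run of the rolling-restart order: ZooKeeper comes fully back up
# before any Solr node starts, and Solr nodes start one at a time.
SOLR_HOSTS="solr1 solr2 solr3"   # hypothetical hosts
ZK_HOSTS="zk1 zk2 zk3"           # hypothetical hosts

ORDER=""
step() { ORDER="$ORDER $1"; echo "$1"; }

for h in $SOLR_HOSTS; do step "stop-solr@$h"; done    # 1. stop all Solr instances
for h in $ZK_HOSTS;   do step "stop-zk@$h";   done    # 2. stop all ZooKeeper instances
for h in $ZK_HOSTS;   do step "start-zk@$h";  done    # 3. start all ZooKeeper instances
for h in $SOLR_HOSTS; do step "start-solr@$h"; done   # 4. start Solr one node at a time
```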
07-21-2016
07:50 AM
I solved it by setting <bool name="solr.hdfs.blockcache.direct.memory.allocation">false</bool> in the solrconfig.xml file.
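For context, this flag sits inside the HdfsDirectoryFactory section of solrconfig.xml in a standard Solr-on-HDFS setup. A minimal sketch follows; the solr.hdfs.home URI is a placeholder you would adjust for your own NameNode:

```xml
<directoryFactory name="DirectoryFactory" class="solr.HdfsDirectoryFactory">
  <!-- placeholder HDFS location; point this at your cluster -->
  <str name="solr.hdfs.home">hdfs://namenode:8020/solr</str>
  <!-- disable direct (off-heap) memory allocation for the block cache -->
  <bool name="solr.hdfs.blockcache.direct.memory.allocation">false</bool>
</directoryFactory>
```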
07-21-2016
07:16 AM
@james.jones Thanks a lot, that helped. I have successfully pushed the current config to ZooKeeper.
07-20-2016
12:59 PM
This will upload your config directory to a configset named "testcoll" (by default, the configset name is the same as your collection):

./zkcli.sh -zkhost localhost:2181 -cmd upconfig -confdir ../../solr/configsets/data_driven_schema_configs/conf -confname testcoll

If your configset is called testcoll, do this to show the contents of solrconfig.xml in ZooKeeper:

./zkcli.sh -zkhost localhost:2181 -cmd get /configs/testcoll/solrconfig.xml

I also recommend running the list command, which dumps everything in ZooKeeper, printing the contents of the files rather than just listing their names. That's a bit too much output, so pipe it to "less" and then search for your collection name as you would in vi (with / and ? to search). Then you'll see the path to your configs:

./zkcli.sh -zkhost localhost:2181 -cmd list | less
12-30-2016
02:48 PM
Hi Saurabh,
I know this is an older question, but if you (or anyone else) are still looking to monitor Solr Cloud via Ambari, laying a custom service on top of your existing installation might be useful. The following will allow you to integrate Solr Cloud into Ambari, complete with alerts and the ability to start, stop, and monitor status.
This setup assumes an existing, standard Solr Cloud installation, with the Solr Cloud UI available on port 8983.
On the Ambari node, create /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR/package/scripts. In /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR, create alerts.json and metainfo.xml as follows (you can, of course, change the version to whatever version of Solr you have installed):

alerts.json:

{
  "SOLR": {
    "service": [],
    "SOLR_CLOUD": [
      {
        "name": "solr_cloud_ui",
        "label": "Solr Cloud UI",
        "description": "This host-level alert is triggered if the Solr Cloud Web UI is unreachable.",
        "interval": 1,
        "scope": "ANY",
        "source": {
          "type": "WEB",
          "uri": {
            "http": "http://0.0.0.0:8983",
            "connection_timeout": 5.0
          },
          "reporting": {
            "ok": {
              "text": "HTTP {0} response in {2:.3f}s"
            },
            "warning": {
              "text": "HTTP {0} response from {1} in {2:.3f}s ({3})"
            },
            "critical": {
              "text": "Connection failed to {1} ({3})"
            }
          }
        }
      }
    ]
  }
}
metainfo.xml:

<?xml version="1.0"?>
<metainfo>
  <schemaVersion>2.0</schemaVersion>
  <services>
    <service>
      <name>SOLR</name>
      <displayName>Solr</displayName>
      <comment>Solr is an open source enterprise search platform, written in Java, from the Apache Lucene project.</comment>
      <version>5.2.1</version>
      <components>
        <component>
          <name>SOLR_CLOUD</name>
          <displayName>Solr Cloud Server</displayName>
          <category>MASTER</category>
          <cardinality>1+</cardinality>
          <commandScript>
            <script>scripts/solrcloud.py</script>
            <scriptType>PYTHON</scriptType>
            <timeout>600</timeout>
          </commandScript>
        </component>
      </components>
    </service>
  </services>
</metainfo>
In /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/SOLR/package/scripts, create params.py and solrcloud.py as follows:

params.py:

cloud_stop = ('/sbin/service', 'solr', 'stop')
cloud_start = ('/sbin/service', 'solr', 'start')
cloud_pid_file = '/opt/lucidworks-hdpsearch/solr/bin/solr-8983.pid'

solrcloud.py:

from resource_management import *
from resource_management.core.resources.system import Execute
from resource_management.libraries.functions import check_process_status

class Master(Script):
    def install(self, env):
        print 'Installing Solr Cloud'

    def stop(self, env):
        import params
        env.set_params(params)
        Execute(params.cloud_stop, sudo=True)

    def start(self, env):
        import params
        env.set_params(params)
        Execute(params.cloud_start, sudo=True)

    def status(self, env):
        import params
        env.set_params(params)
        check_process_status(params.cloud_pid_file)

    def configure(self, env):
        print 'Configuring Solr Cloud'

if __name__ == "__main__":
    Master().execute()
At this point, after restarting Ambari, you will be able to "install" Solr Cloud via the Ambari Add Service wizard, specifying a Solr Cloud Server on whichever hosts Solr is already installed. As you might note from solrcloud.py, the installation doesn't do anything other than configure Ambari to be aware that the components exist on the hosts. Once the installation is complete, Solr will be listed as an Ambari Service, with each Solr Cloud server listed as an individual Master component. Hope this helps. Joe
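For what it's worth, the status check that check_process_status performs in solrcloud.py boils down to reading the pid file and probing whether that process is alive. Here is a standalone sketch of that logic; process_is_running is my own hypothetical helper, not the Ambari implementation:

```python
import os

def process_is_running(pid_file):
    """Return True if the pid recorded in pid_file belongs to a live process."""
    try:
        with open(pid_file) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return False  # missing/unreadable pid file, or garbage contents
    try:
        os.kill(pid, 0)  # signal 0 probes for existence without delivering a signal
    except ProcessLookupError:
        return False     # no such process
    except PermissionError:
        return True      # process exists but belongs to another user
    return True
```

Note that Ambari's real check raises ComponentIsNotRunning rather than returning a bool, so treat this only as an illustration of the pid-file pattern.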
07-06-2016
11:02 AM
Yes @sujitha sanku. NiFi was running fine and I had created the Twitter ID as well. Actually, there was a firewall problem; now I am able to run it through my personal internet connection. Thanks for your response.
07-04-2016
11:09 AM
It was a firewall issue; I discovered it when I tried Knox through the web UI. Thanks.
http://<knox_server>:8443/
It should show the below, as expected:
HTTP ERROR: 404 Problem accessing /. Reason: Not Found
06-05-2018
05:46 PM
It worked for me. Thanks!
06-23-2016
04:02 AM
@Saurabh Kumar Hortonworks and Teradata are partners. They have built documentation on the HDP Teradata connector here.
12-06-2018
03:05 AM
I saw this error today. In my case, the HDFS file was in CSV format but the table was defined as ORC.