How to add NiFi and Solr using Ambari
Labels: Apache Solr
Created ‎08-24-2017 05:24 AM
Hi Team,
I want to add Solr and NiFi to my existing Ambari cluster. Please help.
Best regards,
~Kishore
Created ‎08-24-2017 05:28 AM
You can use the HDF management pack and Ambari to add HDF services to an HDP cluster.
You can install "mpacks" to include the HDF components in the HDP cluster, as described in the following link:
.
After installing the mpack, you should be able to add the HDF components (such as NiFi) to the Ambari cluster.
.
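The mpack route described above can be sketched as the session below. The download URL and version number are illustrative assumptions, not taken from the post — look up the exact HDF mpack URL matching your Ambari and HDP versions in the HDF release documentation before running anything.

```shell
# Download the HDF management pack to the Ambari Server host.
# NOTE: the URL and version below are placeholders -- substitute the
# mpack URL for your HDF release.
cd /tmp
wget http://public-repo-1.hortonworks.com/HDF/centos6/3.x/updates/3.0.1.1/tars/hdf_ambari_mp/hdf-ambari-mpack-3.0.1.1-5.tar.gz

# Install the mpack, then restart Ambari Server so the new HDF
# services (NiFi, etc.) are registered with the stack.
ambari-server install-mpack --mpack=/tmp/hdf-ambari-mpack-3.0.1.1-5.tar.gz --verbose
ambari-server restart
```

After the restart, the HDF services should appear in Ambari's "+Add Service" wizard.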
Created ‎08-24-2017 05:54 AM
Thanks @Jay SenSharma
With this I am able to get NiFi.
Solr still doesn't show up in "Choose Services", though.
Created ‎08-24-2017 05:59 AM
You can use "Ambari Infra": https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.1.0/bk_ambari-operations/content/ch_ambari_infr...
The Ambari Infra service has only one component: the Infra Solr Instance. The Infra Solr Instance is a fully managed Apache Solr installation. By default, a single-node SolrCloud installation is deployed when the Ambari Infra service is chosen for installation; however, you should install multiple Infra Solr Instances so that you have distributed indexing and search for Atlas, Ranger, and Log Search (Technical Preview).
To install multiple Infra Solr Instances, simply add them to existing cluster hosts through Ambari's "+Add Service" capability. The number of Infra Solr Instances you deploy depends on the number of nodes in the cluster and the services deployed.
Note that the Infra Solr Instance is intended for use only by HDP components; use by third-party components or applications is not supported.
.
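Adding extra Infra Solr Instances through "+Add Service" can also be scripted against Ambari's REST API. This is a hedged sketch: CLUSTER, HOST, the server address, and the admin credentials are all placeholders to substitute for your environment.

```shell
# Register the INFRA_SOLR component on an additional cluster host via
# Ambari's REST API (equivalent to adding it through the UI wizard).
# CLUSTER, HOST, ambari-server, and admin:admin are placeholders.
curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
  http://ambari-server:8080/api/v1/clusters/CLUSTER/hosts/HOST/host_components/INFRA_SOLR

# Ask Ambari to install the newly registered component; starting it
# afterwards is the same PUT with "state": "STARTED".
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"HostRoles": {"state": "INSTALLED"}}' \
  http://ambari-server:8080/api/v1/clusters/CLUSTER/hosts/HOST/host_components/INFRA_SOLR
```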
Created ‎08-24-2017 06:04 AM
Additionally, please refer to:
=> Download the Ambari management pack to the Ambari Server host. In this example, /tmp is a temporary directory that stores the management pack before it is installed.
cd /tmp
wget http://public-repo-1.hortonworks.com/HDP-SOLR/hdp-solr-ambari-mp/solr-service-mpack-2.2.8.tar.gz
=> Install the management pack on the Ambari Server host, using the following command:
# ambari-server install-mpack --mpack=/tmp/solr-service-mpack-2.2.8.tar.gz
You should see the following output:
Using python /usr/bin/python
Installing management pack
Ambari Server 'install-mpack' completed successfully.
The management pack has now been added to Ambari.
=> Add the Solr service, either during initial cluster installation using the Ambari installation wizard or after cluster deployment.
.
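One step worth adding to the walkthrough above: Ambari Server usually needs a restart after install-mpack before the new service shows up in "Choose Services". A quick follow-up check (the mpacks directory shown is Ambari's default location, an assumption if your install is non-standard):

```shell
# Restart Ambari Server so the freshly installed mpack is picked up.
ambari-server restart

# The installed mpack should now be visible under Ambari's resources
# directory (default path; adjust if Ambari is installed elsewhere).
ls /var/lib/ambari-server/resources/mpacks
```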
Created ‎08-24-2017 08:41 AM
@Jay SenSharma While installing Solr from Ambari, I am getting the following exception:
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch' returned 1.
Error: failure: repodata/a2e514aba11a5426bbdbb7744c3d68f3ca97510b-primary.sqlite.bz2 from Delivery-SharedHosting-RedHat-6.5Server-x86_64: [Errno 256] No more mirrors to try.
stdout:.
Created ‎08-24-2017 09:13 AM
Similar to another thread: https://community.hortonworks.com/questions/122908/having-issue-while-setting-up-the-cluster-using-a...
Execution of '/usr/bin/yum -d 0 -e 0 -y install lucidworks-hdpsearch' returned 1. Error: failure: repodata/a2e514aba11a5426bbdbb7744c3d68f3ca97510b-primary.sqlite.bz2
.
Please try cleaning the yum cache:
yum clean all
.
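The cleanup suggestion above can be extended a bit: after dropping the cached metadata (including the corrupted primary.sqlite.bz2 from the error), rebuild the cache and confirm the repos resolve. This sketch assumes the repo definition itself is correct; the manual cache removal is a belt-and-braces step.

```shell
# Drop all cached repo metadata, including the corrupted
# primary.sqlite.bz2 file named in the error.
yum clean all
rm -rf /var/cache/yum      # remove any leftover cached files

# Rebuild the metadata cache and verify the enabled repos resolve.
yum makecache
yum repolist enabled
```

If `yum makecache` fails on the same repodata path, the problem is on the repository side (a corrupted or incorrect repo), not in the local cache.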
Created ‎08-24-2017 09:45 AM
@Jay SenSharma I tried that option; it didn't work.
Created ‎08-24-2017 10:11 AM
Your repodata does not seem to be right. Either you are using a private offline repo that has a corrupted entry, OR you are using an incorrect repo.
Can you please check whether you are able to wget the following from your repo?
Example:
# wget http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6/repodata/7fcacadb44b1a860fb704...
# wget http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6/repodata/repomd.xml
.
In your case, the following path seems to be incorrect, OR your "/etc/yum.repos.d/xxxx.repo" is not correct:
repodata/a2e514aba11a5426bbdbb7744c3d68f3ca97510b-primary.sqlite.bz2
.
Created ‎10-27-2017 04:17 PM
What is the difference between installing Solr using the MPACK and installing Solr using the Ambari Infra service for Ranger? Is Ambari Infra specific to Ranger, or can I use either of them (MPACK or Ambari Infra) as alternatives to each other? Or is one of them only used for older versions of Hadoop? @Jay SenSharma
