Member since: 01-18-2016
Posts: 163
Kudos Received: 32
Solutions: 19
My Accepted Solutions
Title | Views | Posted
---|---|---
| 1400 | 04-06-2018 09:24 PM
| 1425 | 05-02-2017 10:43 PM
| 3912 | 01-24-2017 08:21 PM
| 23905 | 12-05-2016 10:35 PM
| 6576 | 11-30-2016 10:33 PM
12-06-2016
06:26 PM
Glad you have it worked out. Somehow I missed your last post stating that it is already installed, so my previous comment is OBE (overtaken by events).
12-06-2016
12:18 AM
Are you also using Ambari 2.4.x? I believe Ambari Infra is shipped with Ambari 2.4.x, not HDP 2.5. It's sort of like Ambari Metrics: part of Ambari but not directly part of HDP, even though it works with HDP. Assuming you have Ambari 2.4.x, do you not see "Ambari Infra" when you select Add Service? (It does not have the name Solr in it in Ambari.)
12-05-2016
11:38 PM
@Sami Ahmad Which version of Ambari are you on? Ambari Infra is only in Ambari 2.4.0 and later.
12-05-2016
10:37 PM
Also, to install the right version of Solr, I'd recommend this page: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_security/content/installing_external_solr.html
12-05-2016
10:35 PM
@Sami Ahmad - You've got a couple of things going on.

It looks like you are not using the Ambari Infra Solr installation. To use an external Solr, follow these instructions. You'll be using scripts rather than the UI to create the collection. https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_security/content/solr_ranger_configure_solrcloud.html

The version of Solr you are using may not work with Ambari. You should use 5.2.1 or 5.5.2 to be consistent with what's released with HDP. If you go to 6.x, being a major release, you may encounter significant issues.

Also note that $SOLR/server/lib is not the directory where your config files would belong with Solr anyway. Normally you'll put them into a "conf" directory somewhere. They cannot be in a directory with binary files, or you will definitely encounter issues when Solr tries to load the directory into ZooKeeper, and then you have a bigger mess to clean up.
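A minimal sketch of that external-Solr flow, assuming a Solr 5.x layout. The install path, ZooKeeper address, config dir, and config name are all assumptions for your environment; the script only prints the commands it builds, so it is safe to run as-is and review before executing anything:

```shell
# All of these values are assumptions -- adjust to your external Solr install.
SOLR_DIR="${SOLR_DIR:-/opt/solr}"
ZK_HOST="${ZK_HOST:-localhost:2181}"
CONF_DIR="${CONF_DIR:-/etc/solr/ranger_audits/conf}"   # plain config files only, no jars
CONF_NAME="ranger_audits"

# zkcli.sh ships under server/scripts/cloud-scripts in Solr 5.x.
ZKCLI="$SOLR_DIR/server/scripts/cloud-scripts/zkcli.sh"

# Step 1: upload the clean config dir to ZooKeeper under the chosen config name.
echo "$ZKCLI -zkhost $ZK_HOST -cmd upconfig -confdir $CONF_DIR -confname $CONF_NAME"
# Step 2: create the collection against that config.
echo "$SOLR_DIR/bin/solr create -c $CONF_NAME -n $CONF_NAME -shards 1 -replicationFactor 1"
```

Remove the `echo` wrappers to actually execute the two steps once the paths are right for your hosts.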
12-01-2016
06:15 PM
@Bilal Arshad Oh, I think I see the problem. Zookeeper already has some data under this configuration. Notice that it says, "re-using existing configuration directory vertex_index". That is probably from your earlier failed attempt when it failed on the jar file (binary files are not allowed). You have three options (maybe more), in this preferred order:

1. Delete the config from ZooKeeper using Solr's zkcli.sh command ($SOLR_DIR/server/scripts/cloud-scripts/zkcli.sh -zkhost localhost:9983 -cmd clear /configs/vertex_index). Then you can create the collection. I have not used this command in a while, so double-check the syntax. If this does not work, I'd just go for #2 since you don't have any data in Solr to worry about.
2. Delete ALL ZooKeeper data: shut down Solr, find and delete the zoo_data directory, then restart Solr and create the collection.
3. Leave the bad files in ZooKeeper and use a different configName: add -n vertex_index2 to your command. This creates a new directory for configs. By default the configName is the same as your collection name.
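A sketch of option 1, assuming Solr 5.x's directory layout and the embedded-ZooKeeper port from this thread. The install root is an assumption, and the script only prints the commands so you can check them before running:

```shell
SOLR_DIR="${SOLR_DIR:-/opt/solr}"       # assumed install root -- adjust
ZK_HOST="${ZK_HOST:-localhost:9983}"    # embedded ZK port from this thread
CONFIG_NAME="vertex_index"

ZKCLI="$SOLR_DIR/server/scripts/cloud-scripts/zkcli.sh"

# 'clear' recursively deletes a ZooKeeper path; uploaded configs live under /configs.
echo "$ZKCLI -zkhost $ZK_HOST -cmd clear /configs/$CONFIG_NAME"
# Then recreate the collection (port 8984 as used later in this thread).
echo "$SOLR_DIR/bin/solr create -c $CONFIG_NAME -d ./conf -shards 1 -replicationFactor 1 -p 8984"
```

Drop the `echo` wrappers to execute the two steps for real.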
12-01-2016
05:16 PM
@Bilal Arshad - It looks like your ..../solr-5.5.1/conf directory does not contain all of the config files. Make sure it contains solrconfig.xml along with schema.xml and the other config files for Solr. If that is not the case, then I suppose there could be a file/directory permission issue. You should have all of these files in your conf directory:
├── currency.xml
├── lang
│ └── stopwords_en.txt
├── protwords.txt
├── schema.xml
├── solrconfig.xml
├── stopwords.txt
└── synonyms.txt
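As a quick sanity check before uploading, a small script along these lines can confirm the conf directory has the files from the tree above (the default path is a hypothetical placeholder; pass your real conf dir as the first argument):

```shell
# Hypothetical default -- pass your actual Solr conf dir as $1.
CONF_DIR="${1:-./conf}"
missing=0
# File list taken from the expected tree above (lang/stopwords_en.txt omitted for brevity).
for f in currency.xml protwords.txt schema.xml solrconfig.xml stopwords.txt synonyms.txt; do
  if [ ! -f "$CONF_DIR/$f" ]; then
    echo "missing: $f"
    missing=1
  fi
done
if [ "$missing" -eq 0 ]; then
  echo "conf directory looks complete"
fi
```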
12-01-2016
04:51 PM
@Bilal Arshad - This should work. Here is what I did.

Build:
export MAVEN_OPTS="-Xmx1536m -XX:MaxPermSize=512m" && mvn clean install

I did not care about packaging above (mvn clean package -Pdist,external-hbase-solr) but it should still work the same.

Now find the config files:
$ find . -name schema.xml
./distro/src/conf/solr/schema.xml
./distro/target/conf/solr/schema.xml
$ ls ./distro/src/conf/solr/
currency.xml lang protwords.txt schema.xml solrconfig.xml stopwords.txt synonyms.txt
Now create the collection:
$ solr create -c testAtlas -d /home/me/atlas/distro/target/conf/solr -shards 1 -replicationFactor 1 -p 8984
Connecting to ZooKeeper at localhost:9984
Uploading /home/me/atlas/distro/target/conf/solr for config testAtlas to ZooKeeper at localhost:9984
Creating new collection 'testAtlas' using command:
http://HOSTNAME:8984/solr/admin/collections?action=CREATE&name=testAtlas&numShards=1&replicationFactor=1&maxShardsPerNode=1&collection.configName=testAtlas
{
"responseHeader":{
"status":0,
"QTime":2691},
"success":{"":{
"responseHeader":{
"status":0,
"QTime":2505},
"core":"testAtlas_shard1_replica1"}}}
11-30-2016
10:33 PM
Your SOLR_CONF is pointing to the wrong directory. The directory you pointed to contains all of Solr, including jars, etc. You need to point to the directory that holds your config files. I'm short on time so I can't look for it now, but it should be a directory with the files solrconfig.xml and schema.xml. Let me know if that helps or if you need more info.
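One quick way to find candidate directories is to search for solrconfig.xml and print each containing directory; those are the only sensible values for SOLR_CONF. The search root below is an assumption:

```shell
# Assumed search root -- point this at your Solr/Atlas install or checkout.
SEARCH_ROOT="${SEARCH_ROOT:-.}"
found=$(find "$SEARCH_ROOT" -name solrconfig.xml 2>/dev/null)
if [ -n "$found" ]; then
  # Print the directory holding each match; use one of these for SOLR_CONF.
  echo "$found" | while read -r f; do dirname "$f"; done
else
  echo "no solrconfig.xml under $SEARCH_ROOT"
fi
```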
11-30-2016
06:53 PM
I'm glad it worked! You might want to post something on HCC about your Hive issue. However, the way to figure out what is wrong is to look carefully at the logs. You can see some logs in Ambari from the restart, but also go look in Ambari to figure out which sub-component is not running (or all of them). Then go look on that host under /var/log for the component's logs, for example /var/log/hive/ or /var/log/hive-catalog. Do an ls -ltr on the directory and look at the most recent files (both those ending in .log and .out). Look carefully near the bottom of those files for errors and clues. Good luck.
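The log-hunting routine above can be sketched as a short script. The log directory is the typical Hive location and is an assumption for your hosts; swap in whichever component's directory Ambari points you to:

```shell
# Assumed log location -- substitute the failing component's directory.
LOG_DIR="${LOG_DIR:-/var/log/hive}"
# Newest files sort last with -ltr; show the tail of the listing.
ls -ltr "$LOG_DIR" 2>/dev/null | tail -5
# Grab the single most recent .log/.out file, if any.
newest=$(ls -t "$LOG_DIR"/*.log "$LOG_DIR"/*.out 2>/dev/null | head -1)
if [ -n "$newest" ]; then
  # Pull the last few errors/exceptions from it -- usually near the bottom.
  grep -iE 'error|exception' "$newest" | tail -20
else
  echo "no .log/.out files under $LOG_DIR"
fi
```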