Member since 01-09-2015 · 16 Posts · 2 Kudos Received · 0 Solutions
04-02-2015
12:51 AM
We have just upgraded our cluster to the latest version of CDH (5.3.2), and we're using SolrJ version 4.4.0-cdh5.3.2:

<dependency>
    <groupId>org.apache.solr</groupId>
    <artifactId>solr-solrj</artifactId>
    <version>4.4.0-cdh5.3.2</version>
</dependency>

With the same Java code as above, we still get the "No live SolrServers available to handle this request" exception. What should we do to send a document to a SolrCloud server via ZooKeeper?
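For reference, CloudSolrServer takes a zkHost string of the form host1:port,host2:port[/chroot], where the /solr chroot points at the znode under which Solr's cluster state lives. A minimal, dependency-free sketch (the hostnames and the helper name ZkHostCheck are hypothetical, not from the original post) of splitting such a string into its ensemble members and chroot:

```java
import java.util.Arrays;
import java.util.List;

public class ZkHostCheck {
    // CloudSolrServer expects "host1:port,host2:port[,...][/chroot]".
    // Return { ensemble-host list, chroot (possibly empty) }.
    public static String[] parse(String zkHost) {
        int slash = zkHost.indexOf('/');
        String hosts = slash >= 0 ? zkHost.substring(0, slash) : zkHost;
        String chroot = slash >= 0 ? zkHost.substring(slash) : "";
        return new String[] { hosts, chroot };
    }

    public static void main(String[] args) {
        // Hypothetical three-node ensemble with the /solr chroot used by CDH.
        String zkHost = "zk01:2181,zk02:2181,zk03:2181/solr";
        String[] parts = parse(zkHost);
        List<String> members = Arrays.asList(parts[0].split(","));
        System.out.println(members.size() + " ensemble members, chroot=" + parts[1]);
        // prints: 3 ensemble members, chroot=/solr
    }
}
```

Pointing the client at a single tunnelled host (as in the code above) is valid syntax; the point of the sketch is only to make the expected shape of the string explicit.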
03-31-2015
05:54 AM
1 Kudo
Hello, I'm using a CDH 5.3.0 cluster and having problems using SolrJ (version 4.4.0-cdh5.3.0) to insert a document into Solr. According to the documentation, I can use code like the following to interact with a ZooKeeper-managed SolrCloud server:

import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.CloudSolrServer;
import org.apache.solr.common.SolrInputDocument;

import java.io.IOException;

public class standaloneCloud {
    public static void main(String[] args) throws IOException, SolrServerException {
        String myId = "myID";
        String myUri = "this is for sure something usefull and stuff!";
        String myLangCode = "ru";
        String myType = "my_type!";
        String zkHostString = "localhost:2181/solr"; // SSH tunnelled to ZooKeeper

        CloudSolrServer server = new CloudSolrServer(zkHostString);
        server.setDefaultCollection("collection3");

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", myId + "___" + "some keyword");
        doc.addField("uri", myUri);
        doc.addField("alias", "some keyword");
        doc.addField("langCode", myLangCode);
        doc.addField("type", myType);

        server.add(doc);
        server.commit();
    }
}

But when I run it, it throws the following exception:

Exception in thread "main" org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request
at org.apache.solr.client.solrj.impl.LBHttpSolrServer.request(LBHttpSolrServer.java:289)
at org.apache.solr.client.solrj.impl.CloudSolrServer.request(CloudSolrServer.java:310)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:117)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:116)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:102)
at standaloneCloud.main(standaloneCloud.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)

I don't understand why it complains about no live Solr servers: there are live Solr servers, there is a ZooKeeper instance at the given host:port, and I have checked that it returns the address of a valid Solr server on the cluster. To prove it, on the same cluster I compile and run the following very similar code; the only difference is that it uses HttpSolrServer directly instead of CloudSolrServer:

import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class standalone {
    public static void main(String[] args) {
        try {
            String myId = "myID";
            String myUri = "this is for sure something usefull and stuff!";
            String myLangCode = "ru";
            String myType = "my_type!";
            String solrString = "http://node04.demo.hadoop:8983/solr/collection3";

            HttpSolrServer server = new HttpSolrServer(solrString);

            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", myId + "___" + "some keyword");
            doc.addField("uri", myUri);
            doc.addField("alias", "some keyword");
            doc.addField("langCode", myLangCode);
            doc.addField("type", myType);

            server.add(doc);
            server.commit();
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}

This code works perfectly fine and inserts the document into Solr for indexing, without any exceptions. So why can't I use CloudSolrServer and send my document via ZooKeeper when using SolrJ?
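Both variants build exactly the same document and differ only in transport (HTTP URL vs. ZooKeeper ensemble). As a dependency-free sketch of the fields being indexed (a plain Map stands in for SolrInputDocument here, and the helper name DocFields is ours, only to keep the snippet self-contained):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DocFields {
    // Stand-in for the SolrInputDocument built in both samples above.
    public static Map<String, String> buildDoc(String id, String uri,
                                               String alias, String langCode,
                                               String type) {
        Map<String, String> doc = new LinkedHashMap<>();
        doc.put("id", id + "___" + alias); // id is suffixed with the alias keyword
        doc.put("uri", uri);
        doc.put("alias", alias);
        doc.put("langCode", langCode);
        doc.put("type", type);
        return doc;
    }

    public static void main(String[] args) {
        Map<String, String> doc =
            buildDoc("myID", "some uri", "some keyword", "ru", "my_type!");
        System.out.println(doc.get("id")); // prints: myID___some keyword
    }
}
```

Since the document itself is identical in both runs, whatever fails must be in how the Cloud client resolves live nodes from ZooKeeper, not in the document.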
03-31-2015
12:57 AM
1 Kudo
Hello, according to http://www.cloudera.com/content/cloudera/en/training/certification/ccp-ds.html, the whole procedure for becoming a Cloudera Certified Data Scientist has changed. But an important question has not been answered: as someone who passed the DS-200 Data Science Essentials exam but did not complete a practical challenge, what am I supposed to do? I can see that three new exams have been announced, but Cloudera needs to take into account the fact that people like me:
- Paid for and attended the Data Science training.
- Studied hard for the DS-200 exam.
- Paid for, took, and passed the DS-200 exam.
Is Cloudera's plan to tell us something like: "Forget about the above. Wait for the new exams, and then we'll see"? I think an announcement (without the required details) is not enough; Cloudera needs to provide an explanation, including a justification. Kind regards.
Labels:
- Certification
- Training
01-09-2015
07:16 AM
Brad, Thank you very much, problem solved!
01-09-2015
06:56 AM
Hello Brad, Sorry, the problem persists. I clicked the "Sign Out" link at the top right (next to my username), logged in again, and then tried to view the contents of the Data Science Challenge 3 forum, but I still get "Access Denied". Then I tried signing out again; still the same problem 😞
01-09-2015
06:39 AM
Hello,
I can't view the contents of the sub-forum: CCP: Data Scientist Challenge 3.
But I've already registered for CCP Data Science Challenge 3 (paid for it), received the description, started to examine the data sets, etc. When I try to view the forum it says:
"You do not have sufficient privileges for this resource or its parent to perform this action.
Click your browser's Back button to continue."
I have used a different e-mail address for Challenge registration (different from the one I used to register for community.cloudera.com), and I think this might be the problem, but I'm not sure.
Can one of the administrators check this and help me?
Kind regards,
Emre
Labels:
- Certification