Created 08-04-2017 03:44 PM
When I use com.hortonworks.shc-core in Scala like this (df is my DataFrame):
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog

df.write
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5", HBaseTableCatalog.avro -> ""))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .save()
I get this exception:
17/08/04 18:29:46 INFO ClientCnxn: Session establishment complete on server domain.company.com/11.111.111.111:2181, sessionid = 0x15c7820075c1bbd, negotiated timeout = 40000
17/08/04 18:29:46 WARN RpcControllerFactory: Cannot load configured "hbase.rpc.controllerfactory.class" (org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory) from hbase-site.xml, falling back to use default RpcControllerFactory
17/08/04 18:29:46 WARN RpcControllerFactory: Cannot load configured "hbase.rpc.controllerfactory.class" (org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory) from hbase-site.xml, falling back to use default RpcControllerFactory
Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:312)
Created 08-07-2017 08:41 AM
Can you paste the complete stack trace?
Created 08-08-2017 09:02 AM
Do you have a correct value defined for "zookeeper.znode.parent" in hbase-site.xml, or did you specify one yourself? Normally it should be something like "/hbase-unsecure" or "/hbase-secure", depending on whether you have Kerberos or not.
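If the value in hbase-site.xml is not what the client should use, it can also be overridden programmatically; a minimal sketch using the standard HBase client API (the "/hbase-unsecure" path assumes a non-Kerberos cluster):

import org.apache.hadoop.hbase.HBaseConfiguration

val conf = HBaseConfiguration.create() // picks up hbase-site.xml from the classpath
// Must match the znode parent your HBase cluster actually registered in ZooKeeper
conf.set("zookeeper.znode.parent", "/hbase-unsecure")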
Michel
Created 06-18-2019 09:25 AM
java.sql.SQLException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location for replica 0
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2492)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2384)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2384)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:270)
at com.petpe.util.PhoenixConnection.getConnnection(PhoenixConnection.java:26)
at com.petpe.hbase.utill.ManageAccessMangement.getHbaseUser(ManageAccessMangement.java:363)
at com.petpe.users.action.UsersAction.userLogin(UsersAction.java:541)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
I am getting a database connectivity issue. I have checked all the configurations and properties, but I am still facing this issue. What else do I need to do? Can you suggest something?
Thanks in advance.
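For reference, the failing call in the trace above is a plain Phoenix JDBC connection; a minimal sketch of what it looks like (the ZooKeeper host and znode path below are placeholders and must match your cluster):

import java.sql.DriverManager

// URL format: jdbc:phoenix:<zookeeper quorum>:<port>:<znode parent>
val conn = DriverManager.getConnection("jdbc:phoenix:zk1.example.com:2181:/hbase-unsecure")
println(conn.isClosed) // connection was established if no exception was thrown
conn.close()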
Created 10-21-2019 10:28 AM
Hi @hachhll
I am having the same issue when I submit a job on the cluster, but when I submit in local mode, the job runs fine. If you found the solution, please let us know.
Created 10-21-2019 11:27 AM
Hey,
Can you review whether you have configured the HBase Service dependency (in the Hive Service) [1]?
I have come across scenarios where, if this dependency is not configured, an error like [2] can occur.
[1] https://docs.cloudera.com/documentation/enterprise/5-16-x/topics/cdh_ig_hive_hbase.html
[2] org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
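Independent of that dependency, a quick way to confirm that your client can reach HBase at all with the configuration on its classpath is a small connectivity check; a minimal sketch using the standard HBase client API:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.ConnectionFactory

val conf = HBaseConfiguration.create() // reads hbase-site.xml from the classpath
val connection = ConnectionFactory.createConnection(conf)
// Listing table names forces a real round trip to ZooKeeper and the HBase master
println(connection.getAdmin.listTableNames().mkString(", "))
connection.close()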
Created 10-21-2019 12:01 PM
Ahh! In my pom.xml, I was referencing the wrong hbase-site.xml file.
Thanks, @gsthina!
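For anyone else hitting this: a quick way to verify which hbase-site.xml actually lands on the classpath (a minimal sketch; it prints the resolved resource URL, or null if the file is missing):

// Prints the URL of the hbase-site.xml the application actually loads
println(getClass.getClassLoader.getResource("hbase-site.xml"))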
Created 10-22-2019 03:33 AM
Hey @axk, thanks for letting us know. I'm glad it was helpful 🙂