Created 08-05-2016 04:49 AM
Hi,
I'm a beginner in Hadoop. I'm trying to create an external table from Hive but keep getting an error like this:
Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:312)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:152)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
    at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
    at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:398)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:408)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:214)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:673)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:666)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:159)
    at com.sun.proxy.$Proxy12.createTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:718)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4171)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:307)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1720)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1477)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1254)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1118)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1113)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
    at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:71)
    at org.apache.hive.service.cli.operation.SQLOperation$1$1.run(SQLOperation.java:206)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
    at org.apache.hive.service.cli.operation.SQLOperation$1.run(SQLOperation.java:218)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
)
What should I do? I searched for this problem and some people said it happens because the ZooKeeper quorum Hive uses doesn't point at the right ZooKeeper nodes for HBase. Is this true? And how can I fix this problem?
Thank you to whoever can help me.
Created 08-05-2016 04:50 AM
Can you please share your DDL? If you are new, why are you creating an HBase table? Why not first play with simple Hive tables?
Created 08-05-2016 04:59 AM
CREATE EXTERNAL TABLE Test (rowkeyIDEvent string, agentAddress string,agentDnsDomain string,agentHostName string,agentID string,agentMacAddress string,agentNtDomain string,agentReceiptTime string, agentSeverity string,agentTimeZone string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES('hbase.columns.mapping' = ':key, p2p:agentAddress,p2p:agentDnsDomain,p2p:agentHostName,p2p:agentID,p2p:agentMacAddress,p2p:agentNtDomain,p2p:agentReceiptTime, p2p:agentSeverity,p2p:agentTimeZone)
TBLPROPERTIES ('hbase.table.name' = 'logs');
Is this what you mean?
Created 08-05-2016 05:58 AM
It seems there is a small syntax issue in your script.
Created 08-05-2016 07:35 AM
I think my namespace is hbase and the table name is 'logs'.
Created 08-05-2016 09:00 AM
Have you tried using the namespace-qualified name 'hbase:logs' instead of 'logs' as your 'hbase.table.name' in the DDL?
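You can also double-check the actual namespace and table name from the HBase shell (assuming the table really is called 'logs'; adjust the name if yours differs):

hbase shell
list_namespace
list 'logs'
describe 'logs'

If the table shows up as plain 'logs' it lives in the default namespace and 'hbase.table.name' = 'logs' is correct; if it shows up as 'hbase:logs' then use the namespace-qualified name.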
Created 08-06-2016 05:47 AM
try this:
CREATE EXTERNAL TABLE Test
(
rowkeyIDEvent string,
agentAddress string,
agentDnsDomain string,
agentHostName string,
agentID string,
agentMacAddress string,
agentNtDomain string,
agentReceiptTime string,
agentSeverity string,
agentTimeZone string
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES
( "hbase.columns.mapping" = ":key,p2p:agentAddress,p2p:agentDnsDomain,p2p:agentHostName,p2p:agentID,p2p:agentMacAddress,p2p:agentNtDomain,p2p:agentReceiptTime,p2p:agentSeverity,p2p:agentTimeZone" )
TBLPROPERTIES ("hbase.table.name" = "hbase:logs");
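Note that the "hbase.columns.mapping" value is written as one string with no spaces or line breaks between entries; the mapping is whitespace-sensitive, and stray whitespace gets treated as part of the column family/qualifier names. Also, HBase separates a namespace from a table name with a colon, so if the table really lives in the hbase namespace the qualified name is hbase:logs; a table created without a namespace is just logs.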
Created 08-05-2016 06:10 AM
Hi @Sha J, these are the steps to add the HBase ZooKeeper info into Hive:
1) Edit Ambari->Hive->Configs->Advanced->Custom hive-site->Add Property..., and add the HBase ZooKeeper properties based on your HBase configuration (you can search in Ambari->HBase->Configs); the typical properties are listed after step 2 below.
2) Restart Hive via Ambari
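For step 1, the properties Hive usually needs are the HBase ZooKeeper settings (the values below are placeholders; copy the actual values from Ambari->HBase->Configs):

hbase.zookeeper.quorum=<comma-separated list of your HBase ZooKeeper hosts, e.g. node1,node2,node3>
hbase.zookeeper.property.clientPort=2181
zookeeper.znode.parent=/hbase-unsecure

(zookeeper.znode.parent is typically /hbase-unsecure on a non-kerberized HDP cluster and /hbase on a stock Apache install; check the value your HBase is actually using.)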
Created 08-05-2016 07:10 AM
I've tried this but still get the same error. :(