Member since: 12-02-2016
Posts: 4
Kudos Received: 0
Solutions: 0
07-11-2017
01:32 PM
Phoenix-Hive integration has been available since Phoenix 4.8, so if you are trying to create a Phoenix table from Hive, that is the way to go. But if you just need to create a Phoenix table on top of HBase, then as @Rajeshbabu Chintaguntla said, you don't need the Phoenix storage handler, and passing the table properties is not required.
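For the direct path, here is a minimal sketch of creating a Phoenix table over HBase through the Phoenix JDBC driver, with no Hive storage handler involved. The ZooKeeper quorum, table name and columns are placeholders for illustration, not anything from this thread.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreatePhoenixTable {
    public static void main(String[] args) throws Exception {
        // Connect through the Phoenix thick driver; "zk1,zk2,zk3:2181:/hbase" is a placeholder quorum.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3:2181:/hbase");
             Statement stmt = conn.createStatement()) {
            // Phoenix creates (or maps) the backing HBase table itself, so no
            // Hive storage handler or extra table properties are needed for this path.
            stmt.execute("CREATE TABLE IF NOT EXISTS SALES_RAW ("
                    + " SALE_ID BIGINT NOT NULL PRIMARY KEY,"
                    + " SALE_VALUE DOUBLE,"
                    + " SALE_DATE DATE)");
            conn.commit();
        }
    }
}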
07-04-2017
02:36 PM
Query timeout exception when running Phoenix queries using JDBC

HDP version: 2.6.1
Phoenix: 4.7.0 (bundled with HDP 2.6.1)

We have one Phoenix table on which a sum/count takes approximately 95 seconds; the issue is the same in sqlline and in a Java program (JDBC). This is what we did:
1. Increased the Phoenix query timeout to 3 minutes.
2. Increased the HBase RPC timeout to 3 minutes.
After the above changes the queries run fine in sqlline (which is how I know they take about 95 seconds), but the issue remains when I run them from the Java program. I tried what @Josh Elser suggested and it worked. The challenge now is that we would rather not override these properties from the client side. It would be helpful to find out where Phoenix picks up these default properties, so that we can push the updated config files instead. Any help is much appreciated; if I have missed any details, please let me know. (A sketch of the client-side workaround follows the stack trace below.)

Exception in thread "main" org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36, exceptions:
Tue Jul 04 15:24:44 BST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60305: row ' on table 'tablexyz' at region=tablexyz,\x00\x80\x00\x01R\xCF\x89|@\x00\x00\x00\x004103\x001645\x001\x008,1499089548608.14418aca90d9c564e131beda51afb205., hostname=abc.mn.ak.corp,16020,1499177694345, seqNum=1904864
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:771)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:714)
at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
at org.apache.phoenix.iterate.UngroupedAggregatingResultIterator.next(UngroupedAggregatingResultIterator.java:39)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:778)
at com.test.PhoenixJDBC.main(PhoenixJDBC.java:30)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36, exceptions:
Tue Jul 04 15:24:44 BST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60305: row 'n table 'tablexyz' at region=tablexyz,\x00\x80\x00\x01R\xCF\x89|@\x00\x00\x00\x004103\x001645\x001\x008,1499089548608.14418aca90d9c564e131beda51afb205., hostname=abc.mn.ak.corp,16020,1499177694345, seqNum=1904864
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:766)
... 8 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: Failed after attempts=36, exceptions:
Tue Jul 04 15:24:44 BST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60305: row 'on table 'tablexyz' at region=tablexyz,\x00\x80\x00\x01R\xCF\x89|@\x00\x00\x00\x004103\x001645\x001\x008,1499089548608.14418aca90d9c564e131beda51afb205., hostname=abc.mn.ak.corp,16020,1499177694345, seqNum=1904864
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:203)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:108)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Tue Jul 04 15:24:44 BST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=60305: row 'on table 'tablexyz' at region=tablexyz,\x00\x80\x00\x01R\xCF\x89|@\x00\x00\x00\x004103\x001645\x001\x008,1499089548608.14418aca90d9c564e131beda51afb205., hostname=abc.mn.ak.corp,16020,1499177694345, seqNum=1904864
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:210)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:199)
... 7 more
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=60305: row 'on table 'tablexyz' at region=tablexyz,\x00\x80\x00\x01R\xCF\x89|@\x00\x00\x00\x004103\x001645\x001\x008,1499089548608.14418aca90d9c564e131beda51afb205., hostname=abc.mn.ak.corp,16020,1499177694345, seqNum=1904864
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
... 3 more
Caused by: java.io.IOException: Call to abc.mn.ak.corp/877.445.27.87:16020 failed on local exception: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=70, waitTime=60001, operationTimeout=60000 expired.
at org.apache.hadoop.hbase.ipc.RpcClientImpl.wrapException(RpcClientImpl.java:1261)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1229)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32831)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:379)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:201)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:63)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 4 more
Caused by: org.apache.hadoop.hbase.ipc.CallTimeoutException: Call id=70, waitTime=60001, operationTimeout=60000 expired.
at org.apache.hadoop.hbase.ipc.Call.checkAndSetTimeout(Call.java:70)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1203)
... 14 more
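For reference, a minimal sketch of the client-side workaround described above: passing the timeouts as connection properties so the JDBC client does not fall back to the 60-second defaults seen in the trace. The ZooKeeper quorum, the query (against tablexyz from the trace), and the 3-minute values are placeholders based on the description, not the exact code used here.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class PhoenixTimeoutExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Raise the Phoenix statement timeout and the HBase RPC timeout to 3 minutes,
        // matching the changes that already made the query work in sqlline.
        props.setProperty("phoenix.query.timeoutMs", "180000");
        props.setProperty("hbase.rpc.timeout", "180000");
        // The scanner timeout often needs to be raised alongside the RPC timeout.
        props.setProperty("hbase.client.scanner.timeout.period", "180000");

        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:zk1,zk2,zk3:2181:/hbase", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM tablexyz")) {
            if (rs.next()) {
                System.out.println("count = " + rs.getLong(1));
            }
        }
    }
}

To avoid overriding these properties in code, the same keys can instead be set in the hbase-site.xml on the client's classpath, which is where the Phoenix thick client reads its configuration defaults from.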
12-18-2015
11:46 AM
Thank you very much, @Guilherme Braccialli. I will try to implement this logic and will let you know how it goes.
... View more
12-17-2015
12:08 PM
Hi all, I have a scenario where I have to compute sum(column x) for a month range and sum(column x) for a year range, and insert both results into the next table. An example:

sale_id, saleValue, date
1, 1000, 2015/12/14
2, 2000, 2015/11/01
3, 3000, 2015/12/01
4, 4000, 2015/01/01

Here, for this month the output is sum(saleValue) for ids 1 and 3 (1000 + 3000 = 4000), and for the last year it is sum(saleValue) for ids 1, 2, 3 and 4 (1000 + 2000 + 3000 + 4000 = 10000). I have to insert both values into the next table. I appreciate your help.
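One way to express this intent in Hive over JDBC is sketched below. The table names (sales, sale_totals), column names, and the literal month/year filters are illustrative assumptions derived from the sample rows above, not a confirmed schema; dates are assumed to be stored as yyyy/MM/dd strings as shown.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class MonthYearTotals {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC connection; host, database and credentials are placeholders.
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://hiveserver:10000/default", "user", "");
             Statement stmt = conn.createStatement()) {
            // Month-level total: rows 1 and 3 in the sample data (1000 + 3000 = 4000).
            stmt.execute("INSERT INTO TABLE sale_totals "
                    + "SELECT 'month', SUM(salevalue) FROM sales WHERE sale_date LIKE '2015/12/%'");
            // Year-level total: rows 1-4 in the sample data (1000 + 2000 + 3000 + 4000 = 10000).
            stmt.execute("INSERT INTO TABLE sale_totals "
                    + "SELECT 'year', SUM(salevalue) FROM sales WHERE sale_date LIKE '2015/%'");
        }
    }
}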
Labels: Apache Hive