Time out in creating table in Phoenix based on table in HBase
Labels: Apache HBase, Apache Phoenix
Created ‎04-27-2018 08:38 AM
Hi,
I'm running sandbox 2.6 and I'm trying to create a table in Phoenix on top of an existing HBase table using the following statement:
CREATE TABLE "eco_2015_2016_3" (
    "row" VARCHAR PRIMARY KEY,
    "year"."year" VARCHAR, "year"."year_title" VARCHAR,
    "month"."month" VARCHAR, "month"."month_title" VARCHAR,
    "company"."company" VARCHAR, "company"."company_title" VARCHAR,
    "account"."account" VARCHAR, "account"."account_title" VARCHAR,
    "responsibility"."responsibility" VARCHAR, "responsibility"."responsibility_title" VARCHAR,
    "function"."function" VARCHAR, "function"."function_title" VARCHAR,
    "activity"."activity" VARCHAR, "activity"."activity_title" VARCHAR,
    "project_type"."project_type" VARCHAR, "project_type"."project_type_title" VARCHAR,
    "project"."project" VARCHAR, "project"."project_title" VARCHAR,
    "object"."object" VARCHAR, "object"."object_title" VARCHAR,
    "counterpart"."counterpart" VARCHAR, "counterpart"."counterpart_title" VARCHAR,
    "free"."free" VARCHAR, "free"."free_title" VARCHAR,
    "amount"."amount" VARCHAR
);
and I get the following error:
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
The same statement works on a smaller data set, so I assume it times out because of the size of the table. I have increased the HBase RPC timeout to 2 minutes 30 seconds and the Phoenix query timeout to 600,000 ms, so neither of these seems to be the cause (the error still reports 60,000 ms). I found this post about the properties that need to be set to avoid the timeout: https://community.hortonworks.com/content/supportkb/49037/phoenix-sqlline-query-on-larger-data-set-f... but I don't understand how to find these properties in Ambari on the sandbox. I would really appreciate it if someone could explain that to me.
Thanks,
//Rebecca
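As an aside, besides the server-side settings, these timeouts can also be supplied as client-side properties when opening the Phoenix JDBC connection. A minimal sketch, assuming the Phoenix thick driver; the JDBC URL and the 600000 ms values are placeholders, not recommendations:

```java
import java.util.Properties;
// java.sql.DriverManager would be needed for the actual connection (commented out below)

public class PhoenixTimeoutExample {
    /** Builds client-side overrides for the three timeout-related properties. */
    public static Properties timeoutProps() {
        Properties props = new Properties();
        props.setProperty("phoenix.query.timeoutMs", "600000");
        props.setProperty("hbase.rpc.timeout", "600000");
        props.setProperty("hbase.client.scanner.timeout.period", "600000");
        return props;
    }

    public static void main(String[] args) {
        Properties props = timeoutProps();
        // Hypothetical connection string; adjust the ZooKeeper quorum for your cluster:
        // Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", props);
        System.out.println("phoenix.query.timeoutMs=" + props.getProperty("phoenix.query.timeoutMs"));
    }
}
```

Note that the scanner timeout is also enforced on the region server side, so a client-side override alone may not be sufficient.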
Created ‎04-27-2018 08:45 AM
Please share the full stack trace of the error that is produced after increasing the timeouts.
Created ‎04-27-2018 09:13 AM
@schhabra Here is the full error:
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:771)
	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:714)
	at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
	at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
	at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
	at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
	at org.apache.phoenix.iterate.UngroupedAggregatingResultIterator.next(UngroupedAggregatingResultIterator.java:39)
	at org.apache.phoenix.compile.PostDDLCompiler$2.execute(PostDDLCompiler.java:285)
	at org.apache.phoenix.query.ConnectionQueryServicesImpl.updateData(ConnectionQueryServicesImpl.java:2823)
	at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:882)
	at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
	at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
	at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
	at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
	at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1440)
	at sqlline.Commands.execute(Commands.java:822)
	at sqlline.Commands.sql(Commands.java:732)
	at sqlline.SqlLine.dispatch(SqlLine.java:808)
	at sqlline.SqlLine.begin(SqlLine.java:681)
	at sqlline.SqlLine.start(SqlLine.java:398)
	at sqlline.SqlLine.main(SqlLine.java:292)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
	at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:766)
	... 21 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
	at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:203)
	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:108)
	at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to get result within timeout, timeout=60000ms
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:206)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:326)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:301)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:166)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:161)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
	at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:199)
	... 7 more
Created ‎04-27-2018 09:19 AM
The current error shows the query is failing during the scan.
Please try increasing the values of the properties below:
hbase.rpc.timeout
phoenix.query.timeoutMs
hbase.client.scanner.timeout.period
If these properties are not present in hbase-site.xml, add them.
The default value of each of these parameters is 60000 ms (60 seconds).
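Putting the three properties together, the corresponding hbase-site.xml entries could look like the sketch below (in Ambari these would typically go under the HBase configuration as custom hbase-site properties). The 600000 ms value is illustrative, not a recommendation:

```xml
<!-- Client/server RPC timeout for HBase calls -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value>
</property>
<!-- Overall Phoenix query timeout -->
<property>
  <name>phoenix.query.timeoutMs</name>
  <value>600000</value>
</property>
<!-- Timeout for an individual scanner lease/next() call -->
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>600000</value>
</property>
```

After changing these, the affected HBase and Phoenix services need to be restarted for the new values to take effect.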
