Member since: 04-18-2018
Posts: 6
Kudos Received: 0
Solutions: 0
04-27-2018
09:13 AM
@schhabra Here is the full error:
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:771)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:714)
at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
at org.apache.phoenix.iterate.BaseGroupedAggregatingResultIterator.next(BaseGroupedAggregatingResultIterator.java:64)
at org.apache.phoenix.iterate.UngroupedAggregatingResultIterator.next(UngroupedAggregatingResultIterator.java:39)
at org.apache.phoenix.compile.PostDDLCompiler$2.execute(PostDDLCompiler.java:285)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.updateData(ConnectionQueryServicesImpl.java:2823)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:882)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)
at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)
at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)
at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1440)
at sqlline.Commands.execute(Commands.java:822)
at sqlline.Commands.sql(Commands.java:732)
at sqlline.SqlLine.dispatch(SqlLine.java:808)
at sqlline.SqlLine.begin(SqlLine.java:681)
at sqlline.SqlLine.start(SqlLine.java:398)
at sqlline.SqlLine.main(SqlLine.java:292)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:766)
... 21 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)
at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:203)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:108)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:103)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:183)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to get result within timeout, timeout=60000ms
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:206)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:326)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:301)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:166)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:161)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
at org.apache.phoenix.iterate.TableResultIterator.initScanner(TableResultIterator.java:199)
... 7 more
04-27-2018
08:38 AM
Hi, I'm running sandbox 2.6 and I'm trying to create a table in Phoenix based on an HBase table using the following statement:

create table "eco_2015_2016_3" (
    "row" VARCHAR primary key,
    "year"."year" VARCHAR, "year"."year_title" VARCHAR,
    "month"."month" VARCHAR, "month"."month_title" VARCHAR,
    "company"."company" VARCHAR, "company"."company_title" VARCHAR,
    "account"."account" VARCHAR, "account"."account_title" VARCHAR,
    "responsibility"."responsibility" VARCHAR, "responsibility"."responsibility_title" VARCHAR,
    "function"."function" VARCHAR, "function"."function_title" VARCHAR,
    "activity"."activity" VARCHAR, "activity"."activity_title" VARCHAR,
    "project_type"."project_type" VARCHAR, "project_type"."project_type_title" VARCHAR,
    "project"."project" VARCHAR, "project"."project_title" VARCHAR,
    "object"."object" VARCHAR, "object"."object_title" VARCHAR,
    "counterpart"."counterpart" VARCHAR, "counterpart"."counterpart_title" VARCHAR,
    "free"."free" VARCHAR, "free"."free_title" VARCHAR,
    "amount"."amount" VARCHAR
);

and I get the following error:

org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: Failed to get result within timeout, timeout=60000ms

The same statement works on a smaller amount of data, so I guess it times out because of the large data set. I have increased the HBase RPC timeout to 2 minutes 30 seconds and the Phoenix query timeout to 600 000 ms, so neither of these seems to be the cause of the timeout (since the error says 60 000 ms). I found this post about the properties that need to be set to avoid the timeout, https://community.hortonworks.com/content/supportkb/49037/phoenix-sqlline-query-on-larger-data-set-fails-wit.html, but I don't understand how to find these properties in Ambari on the sandbox. I would really appreciate it if someone could explain that to me. Thanks, //Rebecca

Update: I've posted the full stack trace in my reply above.
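For context on the linked KB: the stack trace shows the timeout occurring inside PostDDLCompiler, i.e. during the full-table pass Phoenix appears to make when creating a table over existing HBase data, and the 60000 ms in the message matches the default of hbase.client.scanner.timeout.period, which suggests the scanner timeout is the setting still at its default. The usual trio of properties for this error (whether the KB lists exactly these is an assumption) can be added in Ambari under HBase > Configs > Advanced > Custom hbase-site, followed by an HBase restart. A minimal sketch of the resulting hbase-site.xml entries, with illustrative values in milliseconds:

<!-- Sketch only: standard HBase/Phoenix timeout properties;
     the 600000 ms values are illustrative, not recommendations. -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>600000</value>
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>600000</value>
</property>
<property>
  <name>phoenix.query.timeoutMs</name>
  <value>600000</value>
</property>

Since sqlline reads the client-side hbase-site.xml on its classpath, reconnecting with sqlline after the restart should pick up the new values.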
Labels:
- Apache HBase
- Apache Phoenix
04-20-2018
08:36 AM
Found where I could download sandbox 2.5. Will download 2.5 instead. @gdeleon
04-20-2018
08:32 AM
Thank you for your answer @gdeleon. Is there any way to get the 2.5 sandbox instead? I wanted to use HBase for a case study I'm doing, and it would be great to be able to run the tutorial. I did manage to solve the issue by changing the HDFS address to hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv, so maybe I'll be able to work around the rest of the changes too, but it would probably be easier if I could use the older version of the sandbox.
04-19-2018
08:50 PM
I solved this by changing the HDFS address to hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv:

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
    -Dimporttsv.separator=, \
    -Dimporttsv.columns="HBASE_ROW_KEY,events:driverId,events:driverName,events:eventTime,events:eventType,events:latitudeColumn,events:longitudeColumn,events:routeId,events:routeName,events:truckId" \
    driver_dangerous_event \
    hdfs://sandbox-hdp.hortonworks.com:/tmp/data.csv
04-19-2018
08:50 PM
Hi, I have installed the Hortonworks sandbox version --- and have followed this tutorial, https://hortonworks.com/tutorial/learning-the-ropes-of-the-hortonworks-sandbox/, and everything seems to be up and running. I want to use HBase and therefore followed this tutorial, https://hortonworks.com/hadoop-tutorial/introduction-apache-hbase-concepts-apache-phoenix-new-backup-restore-utility-hbase/#section_1, but when I try to import data into HBase with ImportTsv using the following statement:

hbase org.apache.hadoop.hbase.mapreduce.ImportTsv \
    -Dimporttsv.separator=, \
    -Dimporttsv.columns="HBASE_ROW_KEY,events:driverId,events:driverName,events:eventTime,events:eventType,events:latitudeColumn,events:longitudeColumn,events:routeId,events:routeName,events:truckId" \
    driver_dangerous_event \
    hdfs://sandbox.hortonworks.com:/tmp/data.csv

I get the following error:

Exception in thread "main" java.net.ConnectException: Call From sandbox-hdp.hortonworks.com/172.17.0.2 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
at org.apache.hadoop.ipc.Client.call(Client.java:1498)
at org.apache.hadoop.ipc.Client.call(Client.java:1398)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:823)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2165)
at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1442)
at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1438)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1454)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1715)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:294)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at org.apache.hadoop.hbase.mapreduce.ImportTsv.run(ImportTsv.java:721)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.hbase.mapreduce.ImportTsv.main(ImportTsv.java:725)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:650)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
at org.apache.hadoop.ipc.Client$Connection.access$3200(Client.java:397)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1620)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 36 more
What I understand from this is that I can't connect to the namenode (sandbox.hortonworks.com:8020), but everything seems to be up and running. I can copy data from my local machine to HDFS using this command:

hadoop fs -copyFromLocal ~/data.csv /tmp

So I do seem to be able to connect to HDFS. I would be really grateful if someone could point me in the right direction. //Rebecca
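A diagnostic sketch (not from the original thread): the exception shows the client on sandbox-hdp.hortonworks.com calling sandbox.hortonworks.com:8020, so the older tutorial hostname no longer matches this sandbox. Checking which NameNode URI the client configuration actually points at would confirm the mismatch; fs.defaultFS is the standard Hadoop setting, and the hostnames below are taken from the post:

# Print the NameNode URI from the client configuration
# (expected to name sandbox-hdp.hortonworks.com on the 2.6 sandbox).
hdfs getconf -confKey fs.defaultFS

# A listing through that URI should succeed, while the old
# sandbox.hortonworks.com hostname from the tutorial is refused.
hadoop fs -ls hdfs://sandbox-hdp.hortonworks.com:8020/tmp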
Labels:
- Apache HBase