
HBase 1.0.0-cdh5.6.0 Failed to get region location


New Contributor

I'm using HBase 1.0.0-cdh5.6.0 with thrift2. Today thrift2 started getting errors like this:

 

16/04/13 10:08:02 ERROR client.AsyncProcess: Failed to get region location
java.io.IOException: hconnection-0x16188a0c to test is closed
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1116)
        at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:369)
        at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:320)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1513)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1119)
        at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.put(HTablePool.java:449)
        at org.apache.hadoop.hbase.thrift2.ThriftHBaseServiceHandler.putMultiple(ThriftHBaseServiceHandler.java:271)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hbase.thrift2.ThriftHBaseServiceHandler$THBaseServiceMetricsProxy.invoke(ThriftHBaseServiceHandler.java:104)
        at com.sun.proxy.$Proxy11.putMultiple(Unknown Source)
        at org.apache.hadoop.hbase.thrift2.generated.THBaseService$Processor$putMultiple.getResult(THBaseService.java:1486)
        at org.apache.hadoop.hbase.thrift2.generated.THBaseService$Processor$putMultiple.getResult(THBaseService.java:1470)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

It seems that HBASE-14196 fixed this bug, and according to hbase-1.0.0-cdh5.6.0-changes.log, HBase 1.0.0-cdh5.6.0 includes the patch. It's strange that I still get this error.
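
In the meantime, one client-side workaround is to recreate the connection once it reports closed instead of letting puts keep failing. Here is a minimal sketch against the HBase 1.0 Java client API; the table, column family, and qualifier names are placeholders, not our real schema:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class GuardedPutClient {
    private final Configuration conf = HBaseConfiguration.create();
    private Connection connection;

    // Recreate the shared connection if it has been closed or aborted underneath us,
    // instead of letting puts fail with "hconnection-... is closed".
    private synchronized Connection getConnection() throws IOException {
        if (connection == null || connection.isClosed() || connection.isAborted()) {
            connection = ConnectionFactory.createConnection(conf);
        }
        return connection;
    }

    public void put(String tableName, Put put) throws IOException {
        // Table instances are lightweight; open per call and close when done.
        try (Table table = getConnection().getTable(TableName.valueOf(tableName))) {
            table.put(put);
        }
    }

    public static void main(String[] args) throws IOException {
        GuardedPutClient client = new GuardedPutClient();
        Put put = new Put(Bytes.toBytes("row1"));
        put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
        client.put("test", put); // "test" is the table named in the error above
    }
}

This doesn't explain why the connection inside thrift2 gets closed in the first place; it only keeps writers alive across the event.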

4 Replies

Re: HBase 1.0.0-cdh5.6.0 Failed to get region location

Expert Contributor

It looks like https://issues.apache.org/jira/browse/HBASE-14533 (related to HBASE-14196) is where they are working on the issue; based on the conversation there, I'm not sure it is completely fixed.


Re: HBase 1.0.0-cdh5.6.0 Failed to get region location

Rising Star
Both HBASE-14533 and HBASE-14196 are included in CDH 5.6.

Can you upload 1) the client log, 2) the Thrift server log, and 3) the output of the 'hbase version' command from your console?

Dice.

Re: HBase 1.0.0-cdh5.6.0 Failed to get region location

New Contributor

thrift2 log:

 

16/04/13 10:17:33 ERROR client.AsyncProcess: Failed to get region location
java.io.IOException: hconnection-0x16188a0c to test_table is closed
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1116)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:369)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:320)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:206)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1513)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1119)
at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.put(HTablePool.java:449)
at org.apache.hadoop.hbase.thrift2.ThriftHBaseServiceHandler.putMultiple(ThriftHBaseServiceHandler.java:271)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.thrift2.ThriftHBaseServiceHandler$THBaseServiceMetricsProxy.invoke(ThriftHBaseServiceHandler.java:104)
at com.sun.proxy.$Proxy11.putMultiple(Unknown Source)
at org.apache.hadoop.hbase.thrift2.generated.THBaseService$Processor$putMultiple.getResult(THBaseService.java:1486)
at org.apache.hadoop.hbase.thrift2.generated.THBaseService$Processor$putMultiple.getResult(THBaseService.java:1470)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/04/13 10:17:33 ERROR client.AsyncProcess: Failed to get region location

 

regionserver log:

 

2016-04-13 10:17:32,183 WARN org.apache.hadoop.hdfs.DFSClient: DFS chooseDataNode: got # 1 IOException, will wait for 2960.3654391682903 msec.
2016-04-13 10:17:35,147 WARN org.apache.hadoop.hdfs.BlockReaderFactory: I/O error constructing remote block reader.
at org.apache.hadoop.hdfs.RemoteBlockReader2.checkSuccess(RemoteBlockReader2.java:467)
at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:432)
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:879)
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:757)
at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:374)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:624)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:851)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:903)
at java.io.DataInputStream.read(DataInputStream.java:149)
at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:199)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1364)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1640)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1467)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:430)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.seekTo(HFileReaderV2.java:1164)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:254)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:156)
at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:363)
at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:217)
at org.apache.hadoop.hbase.regionserver.HStore.createScanner(HStore.java:2033)
at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2023)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5138)
at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2457)
at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2443)
at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2420)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2154)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32205)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
2016-04-13 10:17:35,148 WARN org.apache.hadoop.hdfs.BlockReaderFactory: I/O error constructing remote block reader.
at org.apache.hadoop.hdfs.RemoteBlockReader2.checkSuccess(RemoteBlockReader2.java:467)
at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:432)
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:879)
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:757)
at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:374)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:624)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:851)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:903)
at java.io.DataInputStream.read(DataInputStream.java:149)
at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:199)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.readAtOffset(HFileBlock.java:1364)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockDataInternal(HFileBlock.java:1640)
at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderImpl.readBlockData(HFileBlock.java:1467)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:430)
at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.seekTo(HFileReaderV2.java:1164)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:254)
at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:156)
at org.apache.hadoop.hbase.regionserver.StoreScanner.seekScanners(StoreScanner.java:363)
at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:217)
at org.apache.hadoop.hbase.regionserver.HStore.createScanner(HStore.java:2033)
at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2023)
at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:5138)
at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:2457)
at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2443)
at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2420)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2154)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32205)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2034)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
2016-04-13 10:17:35,149 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain BP-1525044990-10.6.24.105-1457168824180:blk_1075520132_1779310 from any node: java.io.IOException: No live nodes contain block BP-1525044990-10.6.24.105-1457168824180:blk_1075520132_1779310 after checking nodes = [DatanodeInfoWithStorage[10.6.24.106:50010,DS-15dc6a3b-af74-4823-8d01-271eb6a7f516,DISK], DatanodeInfoWithStorage[10.6.24.107:50010,DS-ddcf4b37-bba9-4ddf-8e9f-8752cbbdca06,DISK], DatanodeInfoWithStorage[10.6.24.111:50010,DS-d42c437b-807e-4a3a-ad9f-6d5818b4d06b,DISK]], ignoredNodes = null No live nodes contain current block Block locations: DatanodeInfoWithStorage[10.6.24.106:50010,DS-15dc6a3b-af74-4823-8d01-271eb6a7f516,DISK] DatanodeInfoWithStorage[10.6.24.107:50010,DS-ddcf4b37-bba9-4ddf-8e9f-8752cbbdca06,DISK] DatanodeInfoWithStorage[10.6.24.111:50010,DS-d42c437b-807e-4a3a-ad9f-6d5818b4d06b,DISK] Dead nodes: DatanodeInfoWithStorage[10.6.24.106:50010,DS-15dc6a3b-af74-4823-8d01-271eb6a7f516,DISK] DatanodeInfoWithStorage[10.6.24.111:50010,DS-d42c437b-807e-4a3a-ad9f-6d5818b4d06b,DISK] DatanodeInfoWithStorage[10.6.24.107:50010,DS-ddcf4b37-bba9-4ddf-8e9f-8752cbbdca06,DISK]. Will get new block locations from namenode and retry...

2016-04-13 10:17:35,149 WARN org.apache.hadoop.hdfs.DFSClient: DFS chooseDataNode: got # 1 IOException, will wait for 1804.3907068280005 msec.

 

client (Go) log:

 

on querying HBase:

TIOError({Message:0xc820a481d0})

 

on reconnecting to HBase:

Reconnect failed:dial tcp 10.6.24.107:9090: getsockopt: connection refused

# hbase version
16/04/13 15:00:50 INFO util.VersionInfo: HBase 1.0.0-cdh5.6.0
16/04/13 15:00:50 INFO util.VersionInfo: Source code repository file:///data/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hbase-1.0.0-cdh5.6.0 revision=Unknown
16/04/13 15:00:50 INFO util.VersionInfo: Compiled by jenkins on Thu Jan 28 21:45:49 PST 2016
16/04/13 15:00:50 INFO util.VersionInfo: From source with checksum d5976c0290dd50b1259b68203fec9ccb
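
About the reconnect failure: "connection refused" on port 9090 means the thrift2 server process itself is not accepting connections, so a client retry can only succeed after the server comes back. Our client is Go, but the retry loop is shaped roughly like this Java sketch (host, port, and attempt counts are illustrative; wrap the socket in TFramedTransport only if the server is configured for framed transport):

import org.apache.hadoop.hbase.thrift2.generated.THBaseService;
import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TSocket;
import org.apache.thrift.transport.TTransport;
import org.apache.thrift.transport.TTransportException;

public class ThriftReconnect {
    // Open a Thrift2 client, retrying with linear backoff while the server is down.
    static THBaseService.Client connect(String host, int port, int maxAttempts)
            throws TTransportException, InterruptedException {
        TTransportException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            TTransport transport = new TSocket(host, port);
            try {
                transport.open(); // "connection refused" lands here while the server is down
                return new THBaseService.Client(new TBinaryProtocol(transport));
            } catch (TTransportException e) {
                last = e;
                Thread.sleep(1000L * attempt); // back off before the next attempt
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // host/port taken from the reconnect error above
        THBaseService.Client client = connect("10.6.24.107", 9090, 5);
        System.out.println("reconnected to thrift2");
    }
}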

Re: HBase 1.0.0-cdh5.6.0 Failed to get region location

New Contributor

Hi Dice,

 

I am using HBase 1.2.0-cdh5.8.3.

I am getting a similar error. Can you please take a look?


@dice wrote:
Both HBASE-14533 and HBASE-14196 are included in CDH 5.6.

Can you upload 1) the client log, 2) the Thrift server log, and 3) the output of the 'hbase version' command from your console?

Dice.

2017-02-27 12:47:28,123 ERROR [main] org.apache.hadoop.hbase.client.AsyncProcess: Failed to get region location
org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to XXXXX:60020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to XXXX:60020 is closing. Call id=34, waitTime=4
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.wrapException(AbstractRpcClient.java:289)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1272)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.get(ClientProtos.java:34070)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1589)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1398)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1199)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:395)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:344)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:230)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:146)
at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:113)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:138)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:94)
at org.apache.pig.backend.hadoop.hbase.HBaseStorage.putNext(HBaseStorage.java:931)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.runPipeline(PigGenericMapReduce.java:467)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.processOnePackageOutput(PigGenericMapReduce.java:432)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.reduce(PigGenericMapReduce.java:412)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.reduce(PigGenericMapReduce.java:256)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1776)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to XXXX:60020 is closing. Call id=34, waitTime=4
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.cleanupCalls(RpcClientImpl.java:1084)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.close(RpcClientImpl.java:863)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.run(RpcClientImpl.java:580)