17/06/13 16:04:08 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, overwrite=false, skipCRC=false, blocking=true, numListstatusThreads=0, maxMaps=24, mapBandwidth=1, sslConfigurationFile='null', copyStrategy='uniformsize', preserveStatus=[], preserveRawXattrs=false, atomicWorkPath=null, logPath=null, sourceFileListing=null, sourcePaths=[hdfs://nn1:8020/dir/year=2017/month=05/day=13], targetPath=hdfs://nn2:8020/dir/day=20170513, targetPathExists=false, filtersFile='null'}
17/06/13 16:04:08 INFO client.AHSProxy: Connecting to Application History server at hs/IP.73:10200
17/06/13 16:04:09 INFO tools.SimpleCopyListing: Paths (files+dirs) cnt = 72512; dirCnt = 1
17/06/13 16:04:09 INFO tools.SimpleCopyListing: Build file listing completed.
17/06/13 16:04:10 INFO tools.DistCp: Number of paths in the copy list: 72512
17/06/13 16:04:11 INFO tools.DistCp: Number of paths in the copy list: 72512
17/06/13 16:04:11 INFO client.AHSProxy: Connecting to Application History server at hs/IP.73:10200
17/06/13 16:04:11 INFO client.RequestHedgingRMFailoverProxyProvider: Looking for the active RM in [rm1, rm2]...
17/06/13 16:04:11 INFO client.RequestHedgingRMFailoverProxyProvider: Found active RM [rm1]
17/06/13 16:04:12 INFO mapreduce.JobSubmitter: number of splits:25
17/06/13 16:04:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1496319477519_0055
17/06/13 16:04:13 INFO impl.YarnClientImpl: Submitted application application_1496319477519_0055
17/06/13 16:04:13 INFO mapreduce.Job: The url to track the job: http://rm:8088/proxy/application_1496319477519_0055/
17/06/13 16:04:13 INFO tools.DistCp: DistCp job-id: job_1496319477519_0055
17/06/13 16:04:13 INFO mapreduce.Job: Running job: job_1496319477519_0055
17/06/13 16:04:20 INFO mapreduce.Job: Job job_1496319477519_0055 running in uber mode : false
17/06/13 16:04:20 INFO mapreduce.Job: map 0% reduce 0%
17/06/13 16:04:35 INFO mapreduce.Job: map 1% reduce 0%
17/06/13 16:04:41 INFO mapreduce.Job: map 2% reduce 0%
17/06/13 16:04:44 INFO mapreduce.Job: map 3% reduce 0%
17/06/13 16:04:49 INFO mapreduce.Job: map 4% reduce 0%
17/06/13 16:04:58 INFO mapreduce.Job: map 5% reduce 0%
17/06/13 16:05:58 INFO mapreduce.Job: map 6% reduce 0%
17/06/13 16:06:58 INFO mapreduce.Job: map 7% reduce 0%
17/06/13 16:07:59 INFO mapreduce.Job: map 8% reduce 0%
17/06/13 16:08:57 INFO mapreduce.Job: map 9% reduce 0%
17/06/13 16:10:07 INFO mapreduce.Job: map 10% reduce 0%
17/06/13 16:11:07 INFO mapreduce.Job: map 11% reduce 0%
17/06/13 16:12:07 INFO mapreduce.Job: map 12% reduce 0%
17/06/13 16:13:06 INFO mapreduce.Job: map 13% reduce 0%
17/06/13 16:14:07 INFO mapreduce.Job: map 14% reduce 0%
17/06/13 16:15:05 INFO mapreduce.Job: map 15% reduce 0%
17/06/13 16:16:06 INFO mapreduce.Job: map 16% reduce 0%
17/06/13 16:17:06 INFO mapreduce.Job: map 17% reduce 0%
17/06/13 16:18:04 INFO mapreduce.Job: map 18% reduce 0%
17/06/13 16:19:05 INFO mapreduce.Job: map 19% reduce 0%
17/06/13 16:20:05 INFO mapreduce.Job: map 20% reduce 0%
17/06/13 16:21:04 INFO mapreduce.Job: map 21% reduce 0%
17/06/13 16:22:04 INFO mapreduce.Job: map 22% reduce 0%
17/06/13 16:23:02 INFO mapreduce.Job: map 23% reduce 0%
17/06/13 16:24:02 INFO mapreduce.Job: map 24% reduce 0%
17/06/13 16:25:03 INFO mapreduce.Job: map 25% reduce 0%
17/06/13 16:26:05 INFO mapreduce.Job: map 26% reduce 0%
17/06/13 16:27:02 INFO mapreduce.Job: map 27% reduce 0%
17/06/13 16:28:01 INFO mapreduce.Job: map 28% reduce 0%
17/06/13 16:29:02 INFO mapreduce.Job: map 29% reduce 0%
17/06/13 16:30:01 INFO mapreduce.Job: map 30% reduce 0%
17/06/13 16:31:00 INFO mapreduce.Job: map 31% reduce 0%
17/06/13 16:32:00 INFO mapreduce.Job: map 32% reduce 0%
17/06/13 16:33:00 INFO mapreduce.Job: map 33% reduce 0%
17/06/13 16:34:00 INFO mapreduce.Job: map 34% reduce 0%
17/06/13 16:34:59 INFO mapreduce.Job: map 35% reduce 0%
17/06/13 16:35:58 INFO mapreduce.Job: map 36% reduce 0%
17/06/13 16:36:58 INFO mapreduce.Job: map 37% reduce 0%
17/06/13 16:37:58 INFO mapreduce.Job: map 38% reduce 0%
17/06/13 16:38:55 INFO mapreduce.Job: map 39% reduce 0%
17/06/13 16:39:55 INFO mapreduce.Job: map 40% reduce 0%
17/06/13 16:40:53 INFO mapreduce.Job: map 41% reduce 0%
17/06/13 16:41:54 INFO mapreduce.Job: map 42% reduce 0%
17/06/13 16:42:53 INFO mapreduce.Job: map 43% reduce 0%
17/06/13 16:43:51 INFO mapreduce.Job: map 44% reduce 0%
17/06/13 16:44:51 INFO mapreduce.Job: map 45% reduce 0%
17/06/13 16:45:51 INFO mapreduce.Job: map 46% reduce 0%
17/06/13 16:46:50 INFO mapreduce.Job: map 47% reduce 0%
17/06/13 16:47:50 INFO mapreduce.Job: map 48% reduce 0%
17/06/13 16:48:51 INFO mapreduce.Job: map 49% reduce 0%
17/06/13 16:49:50 INFO mapreduce.Job: map 50% reduce 0%
17/06/13 16:50:49 INFO mapreduce.Job: map 51% reduce 0%
17/06/13 16:51:50 INFO mapreduce.Job: map 52% reduce 0%
17/06/13 16:52:50 INFO mapreduce.Job: map 53% reduce 0%
17/06/13 16:53:50 INFO mapreduce.Job: map 54% reduce 0%
17/06/13 16:54:49 INFO mapreduce.Job: map 55% reduce 0%
17/06/13 16:55:48 INFO mapreduce.Job: map 56% reduce 0%
17/06/13 16:56:47 INFO mapreduce.Job: map 57% reduce 0%
17/06/13 16:57:46 INFO mapreduce.Job: map 58% reduce 0%
17/06/13 16:58:45 INFO mapreduce.Job: map 59% reduce 0%
17/06/13 16:59:51 INFO mapreduce.Job: map 60% reduce 0%
17/06/13 17:00:50 INFO mapreduce.Job: map 61% reduce 0%
17/06/13 17:01:50 INFO mapreduce.Job: map 62% reduce 0%
17/06/13 17:02:48 INFO mapreduce.Job: map 63% reduce 0%
17/06/13 17:03:49 INFO mapreduce.Job: map 64% reduce 0%
17/06/13 17:05:40 INFO mapreduce.Job: map 65% reduce 0%
17/06/13 17:06:41 INFO mapreduce.Job: map 66% reduce 0%
17/06/13 17:07:39 INFO mapreduce.Job: map 67% reduce 0%
17/06/13 17:08:39 INFO mapreduce.Job: map 68% reduce 0%
17/06/13 17:09:40 INFO mapreduce.Job: map 69% reduce 0%
17/06/13 17:10:39 INFO mapreduce.Job: map 70% reduce 0%
17/06/13 17:11:39 INFO mapreduce.Job: map 71% reduce 0%
17/06/13 17:12:37 INFO mapreduce.Job: map 72% reduce 0%
17/06/13 17:13:36 INFO mapreduce.Job: map 73% reduce 0%
17/06/13 17:14:34 INFO mapreduce.Job: map 74% reduce 0%
17/06/13 17:15:33 INFO mapreduce.Job: map 75% reduce 0%
17/06/13 17:16:34 INFO mapreduce.Job: map 76% reduce 0%
17/06/13 17:17:33 INFO mapreduce.Job: map 77% reduce 0%
17/06/13 17:18:31 INFO mapreduce.Job: map 78% reduce 0%
17/06/13 17:19:29 INFO mapreduce.Job: map 79% reduce 0%
17/06/13 17:20:30 INFO mapreduce.Job: map 80% reduce 0%
17/06/13 17:21:30 INFO mapreduce.Job: map 81% reduce 0%
17/06/13 17:22:30 INFO mapreduce.Job: map 82% reduce 0%
17/06/13 17:23:27 INFO mapreduce.Job: map 83% reduce 0%
17/06/13 17:24:27 INFO mapreduce.Job: map 84% reduce 0%
17/06/13 17:25:25 INFO mapreduce.Job: map 85% reduce 0%
17/06/13 17:26:25 INFO mapreduce.Job: map 86% reduce 0%
17/06/13 17:27:23 INFO mapreduce.Job: map 87% reduce 0%
17/06/13 17:28:23 INFO mapreduce.Job: map 88% reduce 0%
17/06/13 17:29:22 INFO mapreduce.Job: map 89% reduce 0%
17/06/13 17:30:20 INFO mapreduce.Job: map 90% reduce 0%
17/06/13 17:31:18 INFO mapreduce.Job: map 91% reduce 0%
17/06/13 17:32:18 INFO mapreduce.Job: map 92% reduce 0%
17/06/13 17:33:11 INFO mapreduce.Job: Task Id : attempt_1496319477519_0055_m_000019_0, Status : FAILED
Error: java.io.IOException: File copy failed: hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz --> hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:287)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:255)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:52)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz to hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:283)
    ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK]]}
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:304)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:249)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToFile(RetriableFileCopyCommand.java:183)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:123)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:99)
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
    ... 11 more
Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK]]}
    at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:434)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:339)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:274)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1538)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:331)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:300)
    ... 16 more
17/06/13 17:33:12 INFO mapreduce.Job: map 89% reduce 0%
17/06/13 17:33:22 INFO mapreduce.Job: map 93% reduce 0%
17/06/13 17:33:27 INFO mapreduce.Job: Task Id : attempt_1496319477519_0055_m_000019_1, Status : FAILED
Error: java.io.IOException: File copy failed: hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz --> hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:287)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:255)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:52)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz to hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:283)
    ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK], DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK]]}
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:304)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:249)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToFile(RetriableFileCopyCommand.java:183)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:123)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:99)
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
    ... 11 more
Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK], DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK]]}
    at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:434)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:339)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:274)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1538)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:331)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:300)
    ... 16 more
17/06/13 17:33:28 INFO mapreduce.Job: map 89% reduce 0%
17/06/13 17:33:38 INFO mapreduce.Job: map 93% reduce 0%
17/06/13 17:33:43 INFO mapreduce.Job: Task Id : attempt_1496319477519_0055_m_000019_2, Status : FAILED
Error: java.io.IOException: File copy failed: hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz --> hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:287)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:255)
    at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:52)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://nn1:8020/dir/year=2017/month=05/day=13/file.gz to hdfs://nn2:8020/dir/day=20170513/file.gz
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
    at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:283)
    ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK], DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK]]}
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:304)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:249)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToFile(RetriableFileCopyCommand.java:183)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:123)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:99)
    at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
    ... 11 more
Caused by: java.io.IOException: Cannot obtain block length for LocatedBlock{BP-1426797840-IP.11-1461158403571:blk_1453383150_379676374; getBlockSize()=390144; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[IP.18:50010,DS-1579c3c5-a24a-4a52-a58a-d0037f043fcf,DISK], DatanodeInfoWithStorage[IP.19:50010,DS-c7e86e87-90e0-4212-b07d-1ae507f6d4fa,DISK], DatanodeInfoWithStorage[IP.16:50010,DS-f834d014-458c-42e8-b156-8a8b37fcd6bb,DISK], DatanodeInfoWithStorage[IP.21:50010,DS-597a8073-190d-4e46-9e0f-b234992dd5fc,DISK]]}
    at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:434)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:339)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:274)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1538)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:331)
    at org.apache.hadoop.hdfs.DistributedFileSystem$4.doCall(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:327)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:786)
    at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.getInputStream(RetriableFileCopyCommand.java:300)
    ... 16 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/06/13 17:33:44 INFO mapreduce.Job: map 89% reduce 0%
17/06/13 17:33:53 INFO mapreduce.Job: map 93% reduce 0%
17/06/13 17:33:58 INFO mapreduce.Job: map 100% reduce 0%
17/06/13 17:34:00 INFO mapreduce.Job: Job job_1496319477519_0055 failed with state FAILED due to: Task failed task_1496319477519_0055_m_000019
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/06/13 17:34:00 INFO mapreduce.Job: Counters: 35
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=155361
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=18135566
        HDFS: Number of bytes written=18130700
        HDFS: Number of read operations=179
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=40
    Job Counters
        Failed map tasks=4
        Killed map tasks=23
        Launched map tasks=28
        Other local map tasks=28
        Total time spent by all maps in occupied slots (ms)=129038654
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=129038654
        Total vcore-milliseconds taken by all map tasks=129038654
        Total megabyte-milliseconds taken by all map tasks=132135581696
    Map-Reduce Framework
        Map input records=19
        Map output records=0
        Input split bytes=115
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=69
        CPU time spent (ms)=4220
        Physical memory (bytes) snapshot=342384640
        Virtual memory (bytes) snapshot=5555425280
        Total committed heap usage (bytes)=553648128
    File Input Format Counters
        Bytes Read=4751
    File Output Format Counters
        Bytes Written=0
    org.apache.hadoop.tools.mapred.CopyMapper$Counter
        BYTESCOPIED=18130700
        BYTESEXPECTED=18130700
        COPY=19
17/06/13 17:34:00 ERROR tools.DistCp: Exception encountered
java.io.IOException: DistCp failure: Job job_1496319477519_0055 has failed: Task failed task_1496319477519_0055_m_000019
Job failed as tasks failed.
failedMaps:1 failedReduces:0
    at org.apache.hadoop.tools.DistCp.waitForJobCompletion(DistCp.java:215)
    at org.apache.hadoop.tools.DistCp.execute(DistCp.java:158)
    at org.apache.hadoop.tools.DistCp.run(DistCp.java:128)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.tools.DistCp.main(DistCp.java:462)
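
Every failed attempt above dies on the same root cause while opening the source file: "Cannot obtain block length for LocatedBlock". That error usually means the file on the source cluster was never closed, so its last block is still under construction and the DFS client cannot determine its length (a writer such as a crashed client or ingest agent that still holds the lease is the typical cause). A minimal check-and-repair sketch, assuming a Hadoop 2.7+ CLI (the fsck -openforwrite flag and the hdfs debug recoverLease subcommand are standard there) and reusing the anonymized paths exactly as they appear in the log:

    # On the source cluster (nn1): list files under the partition that are still open for write
    hdfs fsck /dir/year=2017/month=05/day=13 -files -blocks -locations -openforwrite

    # For each file reported OPENFORWRITE, force lease recovery so the last block gets finalized
    hdfs debug recoverLease -path /dir/year=2017/month=05/day=13/file.gz -retries 3

Once fsck no longer reports open files, re-running the same distcp (optionally with -update so files already present at the target are skipped) should get past task 000019; alternatively, the -i flag lets DistCp ignore the failure and finish, at the cost of an incomplete copy.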