Support Questions

Problem while taking backup from HDFS to local file system


Contributor

Hi all,

I'm moving HDFS data to the local file system for backup purposes. The data set is 779 GB, and the copy failed with a warning after transferring 769 GB. I used copyToLocal to transfer the data.

I googled and found that the properties below can be set. Kindly suggest whether this is a recommended way to avoid the problem, or advise how else to fix this error.

<property>
  <name>dfs.datanode.socket.write.timeout</name>
  <value>3000000</value>
</property>

<property>
  <name>dfs.client.socket-timeout</name>
  <value>3000000</value>
</property>
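If editing hdfs-site.xml cluster-wide is undesirable, these client-side timeouts can also be passed per command with generic -D options. A sketch, building the command as a string so it can be inspected before running (the 3000000 ms value is the one from the question, i.e. 50 minutes; paths are the ones from the log below):

```shell
# Build the copyToLocal invocation with the timeout overrides applied
# only to this client command (values are in milliseconds).
TIMEOUT_MS=3000000
CMD="hdfs dfs \
  -Ddfs.client.socket-timeout=${TIMEOUT_MS} \
  -Ddfs.datanode.socket.write.timeout=${TIMEOUT_MS} \
  -copyToLocal hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523 /backup/hbase/FullBackup"
echo "$CMD"   # inspect first; then execute with: eval "$CMD"
```

This keeps the longer timeouts scoped to the backup job instead of changing behaviour for every HDFS client on the cluster.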

####### LOG ##########

[hdfs@aps-hadoop5 FullBackup]$ hdfs dfs -copyToLocal hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523 /backup/hbase/FullBackup

17/05/24 20:31:53 WARN hdfs.BlockReaderFactory: BlockReaderFactory(fileName=/backup/hbase/FullBackup/20170523/Recipients/part-m-00000, block=BP-1810172115-hadoop2-1478343078462:blk_1080766518_7050974): I/O error requesting file descriptors. Disabling domain socket DomainSocket(fd=359,path=/var/lib/hadoop-hdfs/dn_socket)

java.net.SocketTimeoutException: read(2) error: Resource temporarily unavailable
	at org.apache.hadoop.net.unix.DomainSocket.readArray0(Native Method)
	at org.apache.hadoop.net.unix.DomainSocket.access$000(DomainSocket.java:45)
	at org.apache.hadoop.net.unix.DomainSocket$DomainInputStream.read(DomainSocket.java:532)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2291)
	at org.apache.hadoop.hdfs.BlockReaderFactory.requestFileDescriptors(BlockReaderFactory.java:539)
	at org.apache.hadoop.hdfs.BlockReaderFactory.createShortCircuitReplicaInfo(BlockReaderFactory.java:488)
	at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.create(ShortCircuitCache.java:784)
	at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.fetchOrCreate(ShortCircuitCache.java:718)
	at org.apache.hadoop.hdfs.BlockReaderFactory.getBlockReaderLocal(BlockReaderFactory.java:422)
	at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:333)
	at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:662)
	at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:898)
	at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:955)
	at java.io.DataInputStream.read(DataInputStream.java:100)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:91)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
	at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:467)
	at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:392)
	at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:329)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:264)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:249)
	at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
	at org.apache.hadoop.fs.shell.Command.recursePath(Command.java:373)
	at org.apache.hadoop.fs.shell.CommandWithDestination.recursePath(CommandWithDestination.java:292)
	at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:319)
	at org.apache.hadoop.fs.shell.Command.recursePath(Command.java:373)
	at org.apache.hadoop.fs.shell.CommandWithDestination.recursePath(CommandWithDestination.java:292)
	at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:319)
	at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:244)
	at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
	at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:221)
	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:350)

17/05/24 20:31:53 WARN shortcircuit.ShortCircuitCache: ShortCircuitCache(0x62ec095e): failed to load 1080766518_BP-1810172115-hadoop2-1478343078462
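Note that this warning is about short-circuit local reads: the client timed out requesting file descriptors over the DataNode's domain socket (/var/lib/hadoop-hdfs/dn_socket), disabled that socket, and fell back to reading over TCP. One way to test whether the domain-socket path is the culprit is to rerun the copy with short-circuit reads disabled for this client only (a sketch; this is a client-side config override, not a cluster change):

```shell
# Sketch: same copy, but with short-circuit local reads disabled for
# this invocation only, to isolate the domain-socket timeout.
hdfs dfs -Ddfs.client.read.shortcircuit=false \
  -copyToLocal hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523 /backup/hbase/FullBackup
```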


Re: Problem while taking backup from HDFS to local file system

Contributor
@Mathi Murugan

Can you check whether the file is corrupted in HDFS?

su - hdfs -c "hdfs fsck /backup/hbase/FullBackup/20170523/Recipients/part-m-00000"
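If fsck reports the block healthy, a per-file retry sometimes gets past a transient socket timeout without redoing the whole 779 GB copy. A minimal sketch (the retry helper and the commented paths are illustrative, not part of Hadoop):

```shell
# retry N CMD...: run CMD up to N times, succeeding on the first attempt
# that exits 0 (illustrative helper, not a Hadoop tool).
retry() {
  n="$1"; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    [ "$i" -ge "$n" ] && return 1
    sleep 5   # brief pause between attempts
  done
}

# Example (assumed single-file retry of the file that failed in the log):
# retry 3 hdfs dfs -copyToLocal \
#   hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523/Recipients/part-m-00000 \
#   /backup/hbase/FullBackup/20170523/Recipients/
```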

Does this happen only on this file when you retry?