
Problem while taking a backup from HDFS to the local file system


Hi all,

I'm moving HDFS data to the local file system for backup purposes. The total size is 779 GB, but when the copy reaches 769 GB it fails with a warning. I used copyToLocal to transfer the data.

I googled and found that we can set a property to avoid this. Kindly suggest whether setting it is recommended, or advise how to fix this error.
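For reference, the warning itself says the client is disabling the short-circuit domain socket, so the property most often suggested in this situation is the HDFS short-circuit read setting. This is only a sketch of that workaround, assuming `dfs.client.read.shortcircuit` is the property in question (verify against your cluster's hdfs-site.xml before relying on it):

```shell
# Disable short-circuit reads for this one copy only, via a client-side
# -D override (does not change the cluster-wide configuration).
# Paths are the ones from this thread.
hdfs dfs -D dfs.client.read.shortcircuit=false \
  -copyToLocal hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523 /backup/hbase/FullBackup
```

With short-circuit reads off, the client falls back to reading blocks through the DataNode over TCP, which avoids the file-descriptor-passing path that is failing here, at some cost in read throughput.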



####### LOG ##########

[hdfs@aps-hadoop5 FullBackup]$ hdfs dfs -copyToLocal hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523 /backup/hbase/FullBackup

17/05/24 20:31:53 WARN hdfs.BlockReaderFactory: BlockReaderFactory(fileName=/backup/hbase/FullBackup/20170523/Recipients/part-m-00000, block=BP-1810172115-hadoop2-1478343078462:blk_1080766518_7050974): I/O error requesting file descriptors. Disabling domain socket DomainSocket(fd=359,path=/var/lib/hadoop-hdfs/dn_socket)
read(2) error: Resource temporarily unavailable
    at ... (Native Method)
    at ...$000(...)
    at ...
    at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(...)
    at org.apache.hadoop.hdfs.BlockReaderFactory.requestFileDescriptors(...)
    at org.apache.hadoop.hdfs.BlockReaderFactory.createShortCircuitReplicaInfo(...)
    at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.create(...)
    at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.fetchOrCreate(...)
    at org.apache.hadoop.hdfs.BlockReaderFactory.getBlockReaderLocal(...)
    at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(...)
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(...)
    at ...$TargetFileSystem.writeStreamToFile(...)
    at ...
    at org.apache.hadoop.fs.FsShell.main(...)

17/05/24 20:31:53 WARN shortcircuit.ShortCircuitCache: ShortCircuitCache(0x62ec095e): failed to load 1080766518_BP-1810172115-hadoop2-1478343078462
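A `read(2)` error of "Resource temporarily unavailable" (EAGAIN) during file-descriptor passing often points at descriptor exhaustion on the client or the DataNode rather than at the data itself. A minimal first check, assuming a Linux host (nothing here is taken from the thread):

```shell
# Per-process open-file limit for the current shell/user
# (for the hdfs user, run this as hdfs, e.g. via: su - hdfs)
ulimit -n

# System-wide open-file-handle ceiling on Linux
# (guarded so it is a no-op on systems without /proc)
cat /proc/sys/fs/file-max 2>/dev/null || true
```

If the per-process limit is low (e.g. the common default of 1024), raising it for the hdfs user in /etc/security/limits.conf before re-running the large copy is worth trying.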


@Mathi Murugan

Can you check whether that file is corrupted in HDFS?

su - hdfs -c "hdfs fsck /backup/hbase/FullBackup/20170523/Recipients/part-m-00000"

Does this happen only on this file when you retry?
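If the copy fails on the same file again, a per-file retry helps isolate whether the problem is that file's blocks or transient resource pressure. A minimal sketch (paths are from this thread; the /tmp scratch directory is hypothetical):

```shell
# Inspect block health of just the failing file; -files -blocks adds
# per-block detail to the fsck report.
su - hdfs -c "hdfs fsck /backup/hbase/FullBackup/20170523/Recipients/part-m-00000 -files -blocks"

# Retry copying only that file to a scratch location.
hdfs dfs -copyToLocal \
  hdfs://aps-hadoop2:8020/backup/hbase/FullBackup/20170523/Recipients/part-m-00000 \
  /tmp/retry-check/
```

If the single-file retry succeeds, the earlier failure was more likely resource exhaustion during the long 779 GB copy than corruption.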