New Contributor
Posts: 1
Registered: ‎11-05-2013

FileNotFoundException: Path is not a file when reading a directory containing files on HDFS

Hi folks,


We have our own recursive FileInputFormat that extends org.apache.hadoop.mapreduce.lib.input.FileInputFormat. Basically, we call FileInputFormat.getSplits to get all the FileSplits of the input path; if a returned FileSplit refers to a directory, we expand that directory, and in this way we flatten all directories (if any) under the input path.
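To illustrate the flattening step, here is a simplified stand-alone sketch. It uses plain java.nio instead of the Hadoop FileSystem/FileStatus API, and the class and method names are illustrative only, not our actual code:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the directory-flattening step: given an input
// path, recursively expand any sub-directories so the result contains
// only regular files (the real code does this on Hadoop FileStatus
// objects inside a FileInputFormat subclass).
public class FlattenDirs {
    public static List<Path> flatten(Path root) throws IOException {
        List<Path> files = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(root)) {
            for (Path entry : stream) {
                if (Files.isDirectory(entry)) {
                    files.addAll(flatten(entry)); // recurse into sub-directory
                } else {
                    files.add(entry);             // keep regular files only
                }
            }
        }
        return files;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny tree: <tmp>/dir1/a.txt and <tmp>/b.txt
        Path tmp = Files.createTempDirectory("flatten-demo");
        Path dir1 = Files.createDirectory(tmp.resolve("dir1"));
        Files.createFile(dir1.resolve("a.txt"));
        Files.createFile(tmp.resolve("b.txt"));
        System.out.println(flatten(tmp).size()); // prints 2: files only
    }
}
```

The point is that the list we hand back downstream should contain files only, never directory entries.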


This works well on HDP 1.3 and on Apache Hadoop 0.20.2 and 1.2. However, we find that FileInputFormat.getSplits throws "FileNotFoundException: Path is not a file" on CDH 4.3. This looks like a backwards-compatibility issue. Is this behavior change expected, or is it a bug?


Here is the stack trace (note that /user/builder/ is the input path, under which we have a directory "dir1" that contains files): Path is not a file: /user/builder/
at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$
at org.apache.hadoop.ipc.RPC$
at org.apache.hadoop.ipc.Server$Handler$
at org.apache.hadoop.ipc.Server$Handler$
at Method)
at org.apache.hadoop.ipc.Server$
stderr/out from shell cmd:
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
at java.lang.reflect.Constructor.newInstance(
at org.apache.hadoop.ipc.RemoteException.instantiateException(
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(
at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(
at org.apache.hadoop.hdfs.DFSClient.getBlockLocations(
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileBlockLocations(
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileBlockLocations(
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(




Posts: 1,903
Kudos: 435
Solutions: 307
Registered: ‎07-31-2013

Re: FileNotFoundException: Path is not a file when reading a directory containing files on HDFS

It's hard to say what is triggering this without seeing your custom code, but calling getFileBlockLocations on a directory inode would certainly yield that exception. Perhaps something is causing your listing-generator function (typically a listStatus override) to now pass directories into the files list accidentally?
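As a side note, newer FileInputFormat versions can recurse into input sub-directories themselves when the recursive-listing switch is enabled. The exact property name varies by Hadoop version and distribution, so treat the following as a hint to check against your release's docs rather than a confirmed fix for CDH 4.3:

```xml
<!-- Hint only; property name is version-dependent.
     Older MR name:  mapred.input.dir.recursive
     Newer name:     mapreduce.input.fileinputformat.input.dir.recursive -->
<property>
  <name>mapred.input.dir.recursive</name>
  <value>true</value>
</property>
```

If that switch is available in your version, it may remove the need for the custom flattening logic entirely.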
New Contributor
Posts: 2
Registered: ‎07-05-2014

Re: FileNotFoundException: Path is not a file when reading a directory containing files on HDFS

I'm also getting the same type of error.

Initially the program was working fine with Now I removed and am trying to run the same. Please give me some inputs on this.