
Hadoop File Not Found Exception for HBase file.


New Contributor

Hi,

While calculating the size of an HBase table I am getting a FileNotFoundException. The code and exception details are below.

========================================================================================

public Long calculateHbaseTableSize(FileSystem fileSystem, DataSource dataSource, String tableName)
        throws IllegalArgumentException, IOException {

    ......

    ContentSummary cs = fileSystem.getContentSummary(
            new Path(Cache.getProperty(CacheConstants.HBASE_TABLE_STORAGE_PATH) + tableName)); // ====> Line 494
    spaceConsumed = cs.getSpaceConsumed();

    if (dataSource.getSecondaryKeys() != null && dataSource.getSecondaryKeys().size() > 0) {
        ContentSummary secondaryIndexSize = fileSystem.getContentSummary(
                new Path(Cache.getProperty(CacheConstants.HBASE_TABLE_STORAGE_PATH) + tableName + "_INDEX"));
        spaceConsumed = spaceConsumed + secondaryIndexSize.getSpaceConsumed();
    }
}

==============================================================================================

Exception details:

17/08/08 17:13:57 WARN SparkETLDataInjection: APP_EXECUTIONjava.io.FileNotFoundException: wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594: No such file or directory.
java.io.FileNotFoundException: wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594: No such file or directory.
	at org.apache.hadoop.fs.azure.NativeAzureFileSystem.getFileStatus(NativeAzureFileSystem.java:2091)
	at org.apache.hadoop.fs.FileSystem.getContentSummary(FileSystem.java:1486)
	at com.augmentiq.maxiq.essential.sparkEtl.SparkETLRun.calculateHbaseTableSize(SparkETLRun.java:494)

The file exists when I browse with hadoop fs -ls <Filename>:

===============================================

maxiq@hn0-mosaic:~$ hadoop fs -ls wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594
Found 3 items
drwxr-xr-x - hbase supergroup 0 2017-08-08 17:12 wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594/.tabledesc
drwxr-xr-x - hbase supergroup 0 2017-08-08 17:12 wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594/.tmp
drwxr-xr-x - hbase supergroup 0 2017-08-08 17:12 wasb://hbasemosaicprod-2017-04-26t12-52-30-908z@landtstorageaccount1.blob.core.windows.net/hbase/data/default/TEST12_2277594/a32161d48ee68d02c7bdfbe562492a8f
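===============================================

In case it helps narrow things down, here is a minimal diagnostic sketch (the class name HBaseTableSizeCheck and the parameters conf and storageRoot are illustrative placeholders, not from my actual code). It logs which FileSystem instance and which fully qualified path getContentSummary actually receives, so a missing trailing "/" in the storage-path property or a default-filesystem mismatch with the wasb:// container would show up before the call fails:

===============================================

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HBaseTableSizeCheck {
    public static long spaceConsumed(Configuration conf, String storageRoot, String tableName)
            throws java.io.IOException {
        // Build the path with Path(parent, child) instead of string concatenation,
        // so a missing trailing "/" in the storage-path property cannot corrupt it.
        Path tablePath = new Path(storageRoot, tableName);

        // Resolve the FileSystem from the path itself, so a mismatch between the
        // default filesystem and the wasb:// container is ruled out.
        FileSystem fs = tablePath.getFileSystem(conf);
        System.out.println("FileSystem URI : " + fs.getUri());
        System.out.println("Qualified path : " + fs.makeQualified(tablePath));
        System.out.println("Path exists?   : " + fs.exists(tablePath));

        ContentSummary cs = fs.getContentSummary(tablePath);
        return cs.getSpaceConsumed();
    }
}

===============================================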

Can anyone please help me understand why this is happening? Please let me know if I am missing something or going wrong anywhere.

2 REPLIES

Re: Hadoop File Not Found Exception for HBase file.

New Contributor

@Mukund Tripathi I am not sure about the above error, but if you want to check the size of your table you can execute "hdfs dfs -du -h /hbase/data/default/TEST12_2277594". This should give you the size of the HBase table. If you need the same number inside a Java flow rather than from the shell, see the sketch below.
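A minimal sketch of the programmatic counterpart of -du (TableDu is just an illustrative class name): getLength() roughly corresponds to the size -du reports, while getSpaceConsumed() additionally accounts for replication on filesystems that replicate.

===============================================

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TableDu {
    public static void main(String[] args) throws Exception {
        // e.g. /hbase/data/default/TEST12_2277594
        Path tablePath = new Path(args[0]);
        FileSystem fs = tablePath.getFileSystem(new Configuration());
        ContentSummary cs = fs.getContentSummary(tablePath);
        System.out.println("Logical size (bytes)  : " + cs.getLength());
        System.out.println("Space consumed (bytes): " + cs.getSpaceConsumed());
    }
}

===============================================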

Regards,

Fahim

Re: Hadoop File Not Found Exception for HBase file.

New Contributor

@Mohammedfahim Pathan I am concerned about why I am getting the above exception, as my flow does not complete because of it. Thanks
