java.io.IOException: Retry attempted 12 times without completing, bailing out


Hello everyone,

 

I implemented a Java Spark (2.4.7) application and I'm currently loading all the processed data into a containerized HBase (2.1). I want to load the data efficiently, so I'm trying to bulk load it into HBase.

I'm using the HBase connector and followed its example (lines 294-356).

 

I pre-split the HBase table into 10 regions and salted each key as follows:

a123 -> 0_a123 (hash(key) mod 10 = 0, concatenated as a prefix to the key)
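
For reference, this is roughly how I pre-split the table so that the split points line up with the salt prefixes (simplified sketch; Context.tableName and Context.FAMILY_BYTES come from my own config class, and admin is an Admin obtained from the connection):

    // Sketch: pre-split on the salt prefixes "1".."9" so that keys "0_..." land in
    // the first region and "9_..." in the last (10 regions in total).
    byte[][] splitKeys = new byte[9][];
    for (int i = 1; i <= 9; i++) {
        splitKeys[i - 1] = Bytes.toBytes(Integer.toString(i));
    }
    TableDescriptor descriptor = TableDescriptorBuilder.newBuilder(Context.tableName)
            .setColumnFamily(ColumnFamilyDescriptorBuilder.of(Context.FAMILY_BYTES))
            .build();
    admin.createTable(descriptor, splitKeys);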

 

I can see 10 HFiles created in my project, but when I call LoadIncrementalHFiles.doBulkLoad(...) I get the following error:

 

[main] ERROR org.apache.hadoop.hbase.tool.LoadIncrementalHFiles - -------------------------------------------------
Bulk load aborted with some files not yet loaded:
-------------------------------------------------
...src/main/java/latro/hbase/hfiles/R/49a108ce7047482bbc3027c03422fd95
.../src/main/java/latro/hbase/hfiles/R/be7a38b8998947eeb634e7ba2d51de99
.../src/main/java/latro/hbase/hfiles/R/ae65c5cdd183490bb33e384fe1dc5345
.../src/main/java/latro/hbase/hfiles/R/62a021fc33ad4d5b9fb0cb0dca7193a9
.../src/main/java/latro/hbase/hfiles/R/91f1dfd7c9ef49c7acb41038fd0e1f2b
.../src/main/java/latro/hbase/hfiles/R/7248f5546b124b8db0f4f1c51b7cbbd8
.../src/main/java/latro/hbase/hfiles/R/c8a8ba7b205748bb871c5f649578598c
.../src/main/java/latro/hbase/hfiles/R/a0f65923d754411bb4dbe185b5ae3acf
.../src/main/java/latro/hbase/hfiles/R/47f6b869b7d545db8541247ae4e6e88e
.../src/main/java/latro/hbase/hfiles/R/5e11aa927f5d4ab88aabd44f01e8fff6

java.io.IOException: Retry attempted 12 times without completing, bailing out
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.performBulkLoad(LoadIncrementalHFiles.java:420)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:343)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java

 

My code:

 

    public void documentsBulkLoad(JavaRDD<DataItem> documentsRDD) throws IOException {
        // Write the RDD out as HFiles under BULKLOAD_OUTPUT_PATH
        hbaseContext.bulkLoad(documentsRDD, Context.tableName, new BulkLoadFunction(), Context.BULKLOAD_OUTPUT_PATH,
                new HashMap<byte[], FamilyHFileWriteOptions>(), false, HConstants.DEFAULT_MAX_FILE_SIZE);

        try (Connection conn = ConnectionFactory.createConnection(Context.config);
             Admin admin = conn.getAdmin();
             Table table = conn.getTable(Context.tableName);
             RegionLocator regionLocator = conn.getRegionLocator(Context.tableName)) {
            // Do bulk load: hand the generated HFiles over to the region servers
            LoadIncrementalHFiles load = new LoadIncrementalHFiles(Context.config);
            load.doBulkLoad(new Path(Context.BULKLOAD_OUTPUT_PATH), admin, table, regionLocator);
        } catch (TableNotFoundException ex) {
            logger.error("Error, table: " + Context.tableName + " not found while trying to bulk load to DB");
            ex.printStackTrace();
        } catch (IOException ex) {
            logger.error("Error, IO exception occurred while trying to bulk load to DB");
            ex.printStackTrace();
        }
    }


    public static class BulkLoadFunction implements Function<DataItem, Pair<KeyFamilyQualifier, byte[]>> {

        // Salt the row key with hash(key) mod 10 so rows spread across the 10 pre-split regions
        private String toDBKey(String key) {
            String keyPrefix = Integer.toString(Math.abs(key.hashCode()) % 10);
            return String.format("%s_%s", keyPrefix, key);
        }

        @Override
        public Pair<KeyFamilyQualifier, byte[]> call(DataItem dataItem) throws Exception {
            // Skip rows without a document
            if (dataItem == null || dataItem.getDocument() == null) {
                return null;
            }
            AbstractDocumentMap document = dataItem.getDocument();
            String dbKey = toDBKey(document.getSelf());

            KeyFamilyQualifier kfq = new KeyFamilyQualifier(Bytes.toBytes(dbKey), Context.FAMILY_BYTES, Context.QUALIFIER_BYTES);
            return new Pair<>(kfq, document.toBytes());
        }
    }

 

 

I saw a similar post without a solution.

I really appreciate any help!

Thanks!