Support Questions

Find answers, ask questions, and share your expertise

HBase Incremental Backup failed on HDP

New Contributor

Created a "test" table in HBase to test the incremental backup feature on HDP.

    hbase(main):002:0> create 'test', 'cf'
    0 row(s) in 1.4690 seconds

    hbase(main):003:0> put 'test', 'row1', 'cf:a', 'value1'
    0 row(s) in 0.1480 seconds

    hbase(main):004:0> put 'test', 'row2', 'cf:b', 'value2'
    0 row(s) in 0.0070 seconds

    hbase(main):005:0> put 'test', 'row3', 'cf:c', 'value3'
    0 row(s) in 0.0120 seconds

    hbase(main):006:0> put 'test', 'row3', 'cf:c', 'value4'
    0 row(s) in 0.0070 seconds

    hbase(main):010:0> scan 'test'       
    ROW                   COLUMN+CELL                                               
    row1                 column=cf:a, timestamp=1317945279379, value=value1        
    row2                 column=cf:b, timestamp=1317945285731, value=value2        
    row3                 column=cf:c, timestamp=1317945301466, value=value4        
    3 row(s) in 0.0250 seconds

I then took a full backup using the command below, and it succeeded:

hbase backup create full hdfs://12.3.4.56:8020/tmp/full test 

Now I want to test the incremental backup on the above "test" table, so I did:

put 'test', 'row123', 'cf:a', 'newValue'

But when I run the command below, it fails:

hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full

Error:

Backup session finished. Status: FAILURE
2017-06-14 09:52:58,853 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException):
        at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.cleanupTargetDir(FullTableBackupProcedure.java:205)
        at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.failBackup(FullTableBackupProcedure.java:279)
        at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:164)
        at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:54)
        at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
        at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:443)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:934)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:736)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:689)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:73)
        at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:416)

Updated:

In the link below it is mentioned that "Backups and restores should be run as the hbase superuser (which is called “hbase” by default)". What does this mean? I am simply running the above backup commands as a regular user with root access. Please suggest.

https://hortonworks.com/blog/coming-hdp-2-5-incremental-backup-restore-apache-hbase-apache-phoenix/

I tried changing the permissions on the HDFS files (/tmp/full), but it did not help.

5 REPLIES

"In the link below it is mentioned that 'Backups and restores should be run as the hbase superuser (which is called “hbase” by default)'. What does this mean?"

It is self-explanatory; perhaps you do not understand how these permissions work. HDFS access permissions are not the same as Linux user permissions: the Linux "root" user typically does not have HDFS superuser access. HBase superusers are defined in hbase-site.xml and default, on HDP, to "hbase".

If you are not using Kerberos, switch to the HBase user (e.g. 'su - hbase'). If you are using Kerberos, kinit as the principal running hbase (e.g. `kinit hbase@REALM`).
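As a sketch, assuming a non-Kerberized cluster and the default superuser name "hbase" (the NameNode address and backup path below are just the ones from your own commands):

```shell
# Switch to the HBase superuser (default "hbase" on HDP)
su - hbase

# Re-run the backups as that user, against the same backup root:
# full backup of the "test" table first...
hbase backup create full hdfs://12.3.4.56:8020/tmp/full test

# ...then the incremental backup against the same backup root
hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full
```

The command syntax here is exactly what you already used; only the user running it changes.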

New Contributor

Thanks, I am able to take an incremental backup now.

But there is an issue with restore:

hbase restore hdfs://12.3.4.56:8020/tmp/full backup_1234567789 test1 -overwrite

java.lang.IllegalStateException: Cannot restore hbase table

New Contributor

Also, what I found after doing the restore: the 'test1' table in HBase ended up empty. So the restore from hdfs://12.3.4.56:8020/tmp/full backup_1234567789 did not get overwritten into the 'test1' table. Does it mean there is no content in my backup?

If the restore succeeded and the table you restored into was empty afterwards, it sounds like your backup was empty.
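One way to sanity-check a backup, as a sketch: list what was actually written under the backup root, and ask the backup CLI about the image. Note the `history`/`describe` subcommands are an assumption here; the exact subcommands available depend on the backup tooling shipped with your HDP release, so check `hbase backup help` first.

```shell
# List the files actually written under the backup root on HDFS
hdfs dfs -ls -R hdfs://12.3.4.56:8020/tmp/full

# If your backup CLI supports them, inspect the backup metadata:
# list known backup sessions...
hbase backup history
# ...and show details (tables, state) for one backup image
hbase backup describe backup_1234567789
```

A directory size from `du` alone only proves metadata was written, not that table data made it into the image.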

New Contributor

Thanks. So how do I check whether my backup is empty or not? `du -s -h` on /tmp/full shows some size for the HDFS backup directory. What is the proper way to check?