Created a "test" table in HBase to test the incremental backup feature on HDP.
hbase(main):002:0> create 'test', 'cf'
0 row(s) in 1.4690 seconds
hbase(main):003:0> put 'test', 'row1', 'cf:a', 'value1'
0 row(s) in 0.1480 seconds
hbase(main):004:0> put 'test', 'row2', 'cf:b', 'value2'
0 row(s) in 0.0070 seconds
hbase(main):005:0> put 'test', 'row3', 'cf:c', 'value3'
0 row(s) in 0.0120 seconds
hbase(main):006:0> put 'test', 'row3', 'cf:c', 'value4'
0 row(s) in 0.0070 seconds
hbase(main):010:0> scan 'test'
ROW     COLUMN+CELL
 row1   column=cf:a, timestamp=1317945279379, value=value1
 row2   column=cf:b, timestamp=1317945285731, value=value2
 row3   column=cf:c, timestamp=1317945301466, value=value4
3 row(s) in 0.0250 seconds
Now I have taken a full backup using the command below, and it succeeded:
hbase backup create full hdfs://22.214.171.124:8020/tmp/full test
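One way to confirm the full backup session actually completed, assuming this HDP build ships the backup CLI's `history` and `describe` subcommands (they may differ between versions), is:

```shell
# List all backup sessions with their IDs, type, and completion state
hbase backup history

# Show details of one session (the backup ID here is illustrative)
hbase backup describe backup_1234567789
```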
Now I want to test an incremental backup on the "test" table above. So what I did:
put 'test', 'row123', 'cf:a', 'newValue'
Now when I run the command below, it fails:
hbase backup create incremental hdfs://126.96.36.199:8020/tmp/full
Backup session finished. Status: FAILURE
2017-06-14 09:52:58,853 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException):
    at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.cleanupTargetDir(FullTableBackupProcedure.java:205)
    at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.failBackup(FullTableBackupProcedure.java:279)
    at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:164)
    at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:54)
    at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
    at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:443)
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:934)
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:736)
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:689)
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:73)
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:416)
In the link below it is mentioned that "Backups and restores should be run as the hbase superuser (which is called "hbase" by default)". What does this mean? I am simply running the backup commands above as an ordinary user with root access. Please suggest.
I tried changing the permissions on the HDFS files (tmp/full), but it made no difference.
"In the link below it is mentioned that 'Backups and restores should be run as the hbase superuser (which is called "hbase" by default)'. What does this mean?"
It is self-explanatory; perhaps you do not understand how these permissions work. HDFS access permissions are not the same as Linux user permissions. The Linux "root" user typically does not have HDFS superuser access. HBase superusers are defined in hbase-site.xml, but default, on HDP, to "hbase".
If you are not using Kerberos, switch to the HBase user (e.g. 'su - hbase'). If you are using Kerberos, kinit as the principal running hbase (e.g. `kinit hbase@REALM`).
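Putting that together, a minimal sketch of running the backup as the hbase user (the keytab path and realm below are illustrative placeholders, not values from your cluster):

```shell
# Non-secure cluster: become the hbase OS user, then run the backup
su - hbase
hbase backup create incremental hdfs://126.96.36.199:8020/tmp/full

# Kerberized cluster: authenticate as the hbase principal first
# (keytab location follows the usual HDP convention, adjust as needed)
kinit -kt /etc/security/keytabs/hbase.headless.keytab hbase@EXAMPLE.COM
hbase backup create incremental hdfs://126.96.36.199:8020/tmp/full
```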
Thanks, I am now able to take an incremental backup.
But there is an issue with the restore:
hbase restore hdfs://188.8.131.52:8020/tmp/full backup_1234567789 test1 -overwrite
java.lang.IllegalStateException: Cannot restore hbase table
Also, what I found after running the restore is that the 'test1' table in HBase ended up empty. So the restore from hdfs://184.108.40.206:8020/tmp/full backup_1234567789 did not overwrite the 'test1' table. Does that mean there is no content in my backup?
Thanks, so how do I check whether my backup is empty or not? Running du -s -h on /tmp/full shows some size for the HDFS backup directory. What is the proper way to check?
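A sketch of how you might verify this, assuming the backup CLI's `describe` subcommand exists in your build (raw HDFS directory size only proves that files were written, not that the session completed successfully):

```shell
# Show the session's metadata: state, table list, and recorded data size
hbase backup describe backup_1234567789

# List the files the backup actually wrote under the backup root
hdfs dfs -ls -R /tmp/full

# After a restore, spot-check the table contents from the HBase shell
echo "scan 'test1'" | hbase shell
```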