01-11-2019 06:22 AM
Hi,
I have an HDP 2.6.4 cluster with HDFS in HA and HBase Master in active/standby, integrated with Active Directory through Ranger.
The documentation for configuring HBase backup,
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_data-access/content/ch_hbase_bar.html,
says to edit /etc/hadoop/conf/container-executor.cfg and add the setting
allowed.system.users=hbase, but when I restart the YARN service the file
reverts to its previous values. How can I make this configuration change persist?
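For reference, this is what I am trying to persist in the generated file. A sketch assuming Ambari's default template values for the other entries; only the last line is the addition from the docs:

```properties
# /etc/hadoop/conf/container-executor.cfg
# (regenerated by Ambari on every YARN restart, so direct edits are overwritten)
yarn.nodemanager.linux-container-executor.group=hadoop
banned.users=hdfs,yarn,mapred,bin
min.user.id=1000
allowed.system.users=hbase
```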
For the backups, I created the target directory and assigned permissions to the hbase user. Running "hbase backup create full hdfs://pchquit01dclhdf01/backups/hbase/ns_hdocument_tl NS_HDOCUMENT_TL:TAB_HDOCUMENT" reports SUCCESS, and the backup is 1.6 TB. However, running the incremental backup with "hbase backup create incremental hdfs://pchquit01dclhdf01/backups/hbase/ns_hdocument_tl" fails with this message:
2019-01-10 18:37:58,273 INFO [main] util.BackupClientUtil: Using existing backup root dir: hdfs://pchquit01dclhdf01/backups/hbase/ns_hdocument_tl
Backup session finished. Status: FAILURE
2019-01-10 18:38:00,760 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException):
at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.cleanupTargetDir(FullTableBackupProcedure.java:205)
at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.failBackup(FullTableBackupProcedure.java:279)
at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:294)
at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:71)
at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:500)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:1086)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:888)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:841)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:77)
at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:443)
Running "hbase backup history" shows the error message for the failed incremental backup, "Can not convert from directory":
ID : backup_1547157122634
Type : INCREMENTAL
Tables : NS_HDOCUMENT_TL:TAB_HDOCUMENT
State : FAILED
Start time : Thu Jan 10 16:52:02 ECT 2019
Failed message : Can not convert from directory hdfs://pchquit01dclhdf01/apps/hbase/data/WALs/pchquit01dhdn01.fj.local,16020,1547143571009/pchquit01dhdn01.fj.local%2C16020%2C1547143571009.default.1547154444917;hdfs://pchquit01dclhdf01/apps/hbase/data/WALs/pchquit01dhdn03.fj.local,16020,1547143577365/pchquit01dhdn03.fj.local%2C16020%2C1547143577365.default.1547154444926 ..................................................................
Any help, please?
Thank you in advance.