
Running Hbase backup command from remote machine fails

Hi,

I am writing a Python script on my local machine that calls a shell script residing on a remote server. The shell script contains the HBase backup command to push a backup to S3. The shell script works fine, taking the backup and pushing it to S3, when executed directly on that remote machine. But when I call that script from my local machine through Python, it errors out.
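A frequent cause of "works on the box, fails when called remotely" is that a non-interactive SSH session (what a Python script typically opens) does not source the login profile, so exports like $AccessKey come through empty. A minimal sketch of that effect, simulated locally with `env -i` (the variable value here is a dummy):

```shell
#!/bin/sh
# A non-interactive shell (what 'ssh host cmd' gives you) starts with only the
# variables ssh passes along; anything exported in ~/.bash_profile is missing.
# Simulate that with 'env -i', which clears the environment:
AccessKey="real-value"
export AccessKey

remote_view=$(env -i sh -c 'echo "$AccessKey"')   # empty: no profile sourced
local_view=$(sh -c 'echo "$AccessKey"')           # inherited from this shell

echo "remote-style shell sees: '$remote_view'"
echo "local shell sees:        '$local_view'"

# If this is the cause, forcing a login shell over ssh may help
# (host and script path below are hypothetical):
#   ssh hbase@remote-host 'bash -l /opt/scripts/hbase_backup.sh'
```

If the remote-style shell prints an empty value, the hbase command on the remote side is being built with blank credentials, which would explain a failure only on remote invocation.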

Below is the command I run in my shell script:

 hbase backup create full s3a://$AccessKey:$SecretKey@$BucketPath -set $BackupSet

I get the error below:

ERROR [main] util.AbstractHBaseTool: Error running command-line tool Failed of exporting snapshot snapshot_1521106075558_default_tablename to s3a://AccessKey:SecretKey@swe-backup-test/backup_1521106069410/default/tablename/ with reason code 1
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
    at java.lang.reflect.Constructor.newInstance(
    at org.apache.hadoop.ipc.RemoteException.instantiateException(
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(
    at org.apache.hadoop.hbase.util.ForeignExceptionUtil.toIOException(
    at org.apache.hadoop.hbase.client.HBaseAdmin$TableBackupFuture.convertResult(
    at org.apache.hadoop.hbase.client.HBaseAdmin$TableBackupFuture.convertResult(
    at org.apache.hadoop.hbase.client.HBaseAdmin$ProcedureFuture.waitProcedureResult(
    at org.apache.hadoop.hbase.client.HBaseAdmin$ProcedureFuture.get(
    at org.apache.hadoop.hbase.client.HBaseAdmin.get(
    at org.apache.hadoop.hbase.client.HBaseAdmin.backupTables(
    at org.apache.hadoop.hbase.client.HBaseBackupAdmin.backupTables(
    at org.apache.hadoop.hbase.backup.impl.BackupCommands$CreateCommand.execute(
    at org.apache.hadoop.hbase.backup.BackupDriver.parseAndRun(
    at org.apache.hadoop.hbase.backup.BackupDriver.doWork(
    at org.apache.hadoop.hbase.backup.BackupDriver.main(
Caused by: org.apache.hadoop.ipc.RemoteException( Failed of exporting snapshot snapshot_1521106075558_default_US_tablename to s3a://Accesskey:SecretKey@swe-backup-test/backup_1521106069410/default/tablename/ with reason code 1
    at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.snapshotCopy(
    at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.executeFromState(
    at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.executeFromState(
    at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(
    at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(
    at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$

After some research I found the below link, which reports the same error I receive when I call my Python script from my local machine.

In that link, they say the /user/hbase directory should be present in HDFS and should be writable. I already have that directory in my HDFS and have made it writable, but I am still facing the same error.
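For anyone following along, the directory check described above can be done with standard HDFS commands (the owner/group shown is an assumption; adjust to your cluster):

```shell
# Confirm /user/hbase exists and inspect its ownership and permissions:
hdfs dfs -ls /user | grep hbase

# If needed, make it owned by the hbase service user and writable
# (owner 'hbase:hbase' is an assumption):
hdfs dfs -chown hbase:hbase /user/hbase
hdfs dfs -chmod 755 /user/hbase
```

Note that the user the backup runs as matters here: if the remote invocation runs the script as a different OS user than the interactive session, the effective HDFS user changes too, and a directory writable for one user may not be writable for the other.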

Can anybody please help me resolve this issue? It is urgent. @Jay Kumar SenSharma



Can you please post a snapshot of your logs?


Hi @Swetha Nelwad,

I see you are using a few parameters in your shell script. Can you check whether those parameters are resolving to values when the script is invoked remotely?

As a first step, you can hardcode the values and call the script directly to see whether it works.

hbase backup create full s3a://$AccessKey:$SecretKey@$BucketPath -set $BackupSet
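Building on that suggestion, one way to check the expansion is to fail fast before the hbase call whenever a parameter is empty. A sketch with dummy values standing in for the real environment exports (variable names match the original command):

```shell
#!/bin/sh
# Abort early if any backup parameter failed to expand, so hbase never sees
# an s3a:// URL with empty credentials.
require_set() {
    if [ -z "$2" ]; then
        echo "ERROR: \$$1 is empty" >&2
        exit 1
    fi
}

# Dummy values for illustration; the real script would get these
# from the environment or a sourced profile:
AccessKey="dummy-access-key"
SecretKey="dummy-secret-key"
BucketPath="swe-backup-test/backups"
BackupSet="my_backup_set"

require_set AccessKey  "$AccessKey"
require_set SecretKey  "$SecretKey"
require_set BucketPath "$BucketPath"
require_set BackupSet  "$BackupSet"

# Log the destination with the secret masked, then run the real command:
echo "backing up to s3a://$AccessKey:***@$BucketPath (set: $BackupSet)"
# hbase backup create full "s3a://$AccessKey:$SecretKey@$BucketPath" -set "$BackupSet"
```

If the script exits with one of the ERROR messages only when invoked remotely, that confirms the environment is not being carried into the remote session.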


Can I add that putting secrets in your s3a:// path is dangerous, as they will end up in Hadoop logs across the cluster.

Best: put them in a JCEKS file in HDFS or another secure keystore.

Good: set them as options in the Hadoop/HBase configuration.

Weak: setting them on the command line with -D options (visible with a ps command).
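For the "Best" option, Hadoop's credential provider API can hold the S3A keys in a JCEKS keystore so they never appear in the URL or the process listing. A sketch of the setup (the keystore path /user/hbase/s3.jceks is illustrative; each create command prompts for the secret):

```shell
# Store the S3A credentials in a JCEKS keystore on HDFS
# (keystore path is an assumption -- pick one readable by the backup user):
hadoop credential create fs.s3a.access.key \
    -provider jceks://hdfs/user/hbase/s3.jceks
hadoop credential create fs.s3a.secret.key \
    -provider jceks://hdfs/user/hbase/s3.jceks

# Then point core-site.xml at the keystore so S3A picks the keys up:
#   <property>
#     <name>hadoop.security.credential.provider.path</name>
#     <value>jceks://hdfs/user/hbase/s3.jceks</value>
#   </property>

# The backup URL no longer needs embedded credentials:
#   hbase backup create full s3a://$BucketPath -set $BackupSet
```

With this in place the keys live only in the keystore file, whose HDFS permissions control who can read them.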