
HBase backup to S3 not working

I am trying to use the HBase backup utility to back up HBase tables to S3, but I am hitting the following error despite supplying the correct access key and secret key.

hbase backup create full s3a://$ACCESS_KEY:$SECRET_KEY@hbase-bucket/tables -set systems

The error is:

 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
java.nio.file.AccessDeniedException: s3a://xxxxxxxxxxxxxxxxxxxxxxxxxxxxx+g@hbase-bucket/tables: getFileStatus on s3a://xxxxxxxxxxxxxxxxxxxxxxxxxxx+g@hbase-bucket/tables: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 99EC757749F1D141), S3 Extended Request ID: gTGozzH9nuMdbfqfmMwrkPh17iuacp0CXQZ3jzcaYNnvdzxgExXQjzxZOrDG+RT0y/ArKI2QOfU

However, I am able to run the following command without any error:

hadoop fs -ls s3a://hbase-bucket/

I have set fs.s3a.access.key and fs.s3a.secret.key in core-site.xml.
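For reference, these are the core-site.xml entries (the key values here are placeholders, not my real credentials):

```xml
<property>
  <name>fs.s3a.access.key</name>
  <value>yourAccessKeyId</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>yourSecretAccessKey</value>
</property>
```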

Any thoughts? I have given full permissions to the bucket.


New Contributor


My colleague and I are working through backing up HBase tables to S3 at this very moment. There are various ways to get around the authentication problems, but perhaps the most straightforward method is to set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables like this:

export AWS_ACCESS_KEY_ID=yourAccessKeyId
export AWS_SECRET_ACCESS_KEY=yourSecretAccessKey

Now we're trying to figure out how and where we're supposed to set the hbase.backup.enabled property. This is perhaps more challenging than it should be because we are still working through a lot of simultaneous learning curves, including the one for HBase itself.

Super Collaborator

hbase.backup.enabled should be set to true in the hbase-site.xml that is deployed on the cluster.
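For example, the entry would look like this (using the property name as given above; note that some HBase releases name this property hbase.backup.enable, so check the reference guide for your version):

```xml
<property>
  <name>hbase.backup.enabled</name>
  <value>true</value>
</property>
```

After editing hbase-site.xml, the relevant HBase services need to be restarted for the change to take effect.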