
Datalake Backup not working

Explorer

Hi folks,

My Data Lake backup on CDP Public Cloud always fails with this error: `Solr: Could not find any valid local directory for s3ablock-0001-`.

Does anyone know how to fix this?

1 ACCEPTED SOLUTION

Master Collaborator

This issue was resolved through the support case; we applied the following solution:

1- Check whether "/tmp/hadoop-solron" is present on the Data Lake master node.
2- If not, create "/tmp/hadoop-solron".
3- It should be owned by user/group solr:solr and have 755 permissions.
4- Also, ensure there is enough free space in /tmp on the Data Lake master.

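As a rough sketch, the steps above translate to something like the following commands, run as root (or via sudo) on the Data Lake master node:

# mkdir -p /tmp/hadoop-solron
# chown solr:solr /tmp/hadoop-solron
# chmod 755 /tmp/hadoop-solron
# df -h /tmp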

8 REPLIES

Master Collaborator

@pandav Welcome to the Cloudera Community. I understand you are facing issues while taking the Data Lake backup.

 

Could you please provide the output of the following commands:

# cdp datalake list-datalake-backups --datalake-name dl-name

# df -kha

 

Also, what is your CDP Runtime version? We have seen this in the past: when taking a Data Lake backup, the Ranger/HMS metadata and Ranger audits are backed up, and temp files are written to the master node before being moved to S3. If that metadata is too big, it fills up the root file system on the master node.
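For example, to confirm whether the root file system is filling up and whether backup temp files are the culprit, you could run something like this on the Data Lake master (a rough sketch; the exact temp path can vary by setup):

# df -h /
# du -sh /tmp/hadoop-* 2>/dev/null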

 

You can also contact Cloudera Support about this.


Cheers!
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.

Explorer

I guess the problem is that the root file system on the master node is filled up. Where does cdpcli store the temp file?

Master Collaborator

No, cdpcli does not store any files in a temp location; it is the Data Lake backup that stores temp files on the master node. You can try the following command, which skips the large metadata pieces:

 

cdp datalake backup-datalake --datalake-name dl-backup --backup-name test-backup --skip-ranger-hms-metadata --skip-atlas-metadata --skip-ranger-audits --backup-location s3a://bucket-name/backup-archive

Please refer to the official doc below for Data Lake backups:

Configuring and running Data Lake backups 
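Once the backup is started, you can check its progress with the list command shown earlier, for example:

# cdp datalake list-datalake-backups --datalake-name dl-name

Each entry in the output includes the backup status, so you can confirm whether the run with the skip flags completed.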

 


Explorer

Ok, that sounds good to me, but do you know the path of the temp file?

Thank you!

Explorer

@shehbazk I deleted some files and restarted the backup, but got the same problem:

`Backup failed, returning datalake to running state. Failure message: Solr: Could not find any valid local directory for s3ablock-0001-`

What is the path of the temp file on the Data Lake master where the backup files are stored temporarily? I'd like to check the free space there.

Master Collaborator

@pandav Thanks for the update. I would request you to please file a support case; we need to check multiple aspects of the backup.


Explorer

Thanks!
