Support Questions

Find answers, ask questions, and share your expertise

How to delete a large log folder with many files faster

Contributor

Problem: deleting a log folder with the following details:

Log size is around 10 GB.

Number of files: around 9,000.

A simple rm -rf on the log folder doesn't work well.

1 ACCEPTED SOLUTION

Super Collaborator

@vpemawat

If you are not using log4j: if you want to delete the files for good, there are not many options other than rm -rf; however, there are a few tweaks you can make to speed it up:

  1. You can run multiple rm commands in parallel.
  2. To do this, you need to be able to logically separate the log files, either by folder or by name format.
  3. Once you have done that, you can run multiple rm commands in the background, something like the below (note the separate output files, so the two jobs don't overwrite each other's nohup log):
nohup rm -fr app1-2016* > /tmp/rm-2016.out 2>&1 &
nohup rm -fr app1-2015* > /tmp/rm-2015.out 2>&1 &
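The parallel-delete idea above can be sketched end to end against a throwaway directory; the app1-YYYY-* file names are hypothetical stand-ins for however your logs are actually named:

```shell
# Create a scratch directory with two batches of dummy log files,
# so the sketch is safe to run anywhere.
demo=$(mktemp -d)
touch "$demo"/app1-2015-{01..50}.log "$demo"/app1-2016-{01..50}.log

# One background job per name prefix; each job gets its own output
# file so the two nohup logs do not clobber each other.
nohup rm -f "$demo"/app1-2015-* > /tmp/rm-2015.out 2>&1 &
nohup rm -f "$demo"/app1-2016-* > /tmp/rm-2016.out 2>&1 &
wait   # block until both background deletions have finished

remaining=$(find "$demo" -type f | wc -l)
echo "files left: $remaining"
rmdir "$demo"
```

The speedup comes from the jobs walking disjoint sets of files; running two rm commands over the same files in parallel would gain nothing.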

If you are using log4j: you should probably use the 'DailyRollingFileAppender' with 'maxBackupIndex'; this caps the number of rolled-over log files kept and purges the older ones. Note that the stock log4j 1.x DailyRollingFileAppender does not support maxBackupIndex; the linked article provides a custom appender that adds it. More details here: http://www.codeproject.com/Articles/81462/DailyRollingFileAppender-with-maxBackupIndex
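As a point of comparison, the built-in log4j 1.x RollingFileAppender supports MaxFileSize and MaxBackupIndex out of the box (it rolls by size rather than by day). A minimal properties sketch; the file path, size, and count are illustrative, not recommendations:

```properties
# Illustrative log4j 1.x config: size-based rolling with a hard cap
# on how many old files are retained (path and limits are examples).
log4j.rootLogger=INFO, roller
log4j.appender.roller=org.apache.log4j.RollingFileAppender
log4j.appender.roller.File=/var/log/app1/app1.log
log4j.appender.roller.MaxFileSize=100MB
log4j.appender.roller.MaxBackupIndex=10
log4j.appender.roller.layout=org.apache.log4j.PatternLayout
log4j.appender.roller.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```

With this in place the log directory can never grow past roughly MaxFileSize * (MaxBackupIndex + 1), so the 10 GB cleanup problem doesn't recur.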

Outside of this, you should consider the following two things for future use cases:

  1. Organize the logs by folder (normally broken down like /logs/appname/yyyy/mm/dd/hh/<log files>).
  2. Have a mechanism that either deletes old log files or archives them to a separate log archive server.
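For point 2, a cron-driven cleanup is often just a find one-liner. A sketch, run here against a throwaway directory (a real job would target something like /logs/appname; the 30-day cutoff is an example, and the backdating relies on GNU touch):

```shell
# Scratch directory with one fresh and one deliberately old file.
demo=$(mktemp -d)
touch "$demo"/fresh.log
touch -d '40 days ago' "$demo"/stale.log   # backdate mtime (GNU touch)

# -mtime +30 matches files last modified more than 30 days ago;
# -delete removes matches without spawning one rm process per file.
find "$demo" -type f -name '*.log' -mtime +30 -delete

kept=$(ls "$demo")   # only fresh.log should survive
rm -f "$demo"/fresh.log && rmdir "$demo"
```

Deleting continuously in small batches like this also avoids ever hitting the slow 9,000-files-at-once case from the original question.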

Hopefully this helps. If it does, please 'accept' and 'upvote' the answer. Thank you!

