I have to delete multiple files after 7 days on a regular basis, and the deletion dates and locations are different for each file. Yes, I could apply a cron job for each folder separately, but that would involve many cron jobs (at least 15). In order to avoid this, I want to create a script which will go to each folder and delete the data. For example:
-rw-r--r-- 1 csbackup other 20223605295 Jun 12 06:40 IO.tgz
As you can see, IO.tgz was created on 12/06/2015 at 06:40. Now I want to delete this file at 17/06/2015 00:00 hours. This is one reason I'm unable to use mtime: it would delete the file exactly 7*24 hours after creation.
I was thinking of comparing the timestamps of the files; however, the stat utility is not present on my machine, and it's not even allowing me to install it.
Can anyone please guide me with a script which I can use to delete files after n days?
Answer
You can put the list of directories you want to search in a file:
# cat file
/data
/d01
/u01/files/
Now you can use a for loop to remove the files in those directories one by one.
for dir in $(cat file); do find "$dir" -type f -mtime 7 | xargs rm -f; done
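If the retention period differs per directory and the cutoff should fall at midnight rather than exactly 7*24 hours after the file was created, a slightly longer variant may suit better. This is only a minimal sketch: the retention.list file name and its "directory days" line format are assumptions, and the -daystart option requires GNU find.

#!/bin/sh
# Hypothetical list file: one "directory days" pair per line, e.g.
#   /data 7
#   /d01 10
#   /u01/files/ 7
LIST=/root/retention.list    # assumed path, adjust as needed

while read -r dir days; do
    [ -d "$dir" ] || continue    # skip blank or invalid lines

    # -daystart (GNU find) measures file age from the start of today,
    # so the cutoff lands on a day boundary (00:00) instead of exactly
    # N*24 hours after the file's modification time.
    find "$dir" -type f -daystart -mtime +"$days" -exec rm -f {} +
done < "$LIST"

Scheduled once from cron at 00:00, this covers all the directories with a single job, and the per-directory day counts can be adjusted in the list file.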