Update: Amazon has now made it so you can set expiration times on objects in S3, see more here: https://forums.aws.amazon.com/ann.jspa?annID=1303
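For reference, object expiration is configured with a bucket lifecycle rule. A minimal sketch of such a rule (the rule ID and empty prefix here are placeholder assumptions; an empty prefix applies the rule to the whole bucket) that expires objects 30 days after creation:

```xml
<LifecycleConfiguration>
  <Rule>
    <ID>delete-old-backups</ID>
    <Prefix></Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <!-- Delete objects 30 days after they were created -->
      <Days>30</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```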

Recently I was working on a project where we upload backups to Amazon S3. I wanted to keep the files around for a certain duration and remove anything older than a month. We use the s3cmd utility for most of our command-line calls to S3, but it doesn’t have a built-in “delete any file in this bucket older than 30 days” function. After some googling we found a few Python-based scripts, but no simple bash script that did what I was looking for. I whipped this one up real quick; it may not be the best looking, but it gets the job done:
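A minimal sketch of that approach, assuming a configured s3cmd and GNU date (`date -d`; the script name and details here are illustrative, not necessarily the original):

```shell
#!/usr/bin/env bash
# deleteOld: remove S3 objects older than N days, via s3cmd.
# Usage: ./deleteOld BUCKET DAYS    e.g. ./deleteOld mybucket 30
# Assumes a configured s3cmd and GNU date ("date -d").

# Read "DATE TIME SIZE URL" lines (the format `s3cmd ls` emits) on stdin
# and print the URL of every object older than the cutoff (epoch seconds).
old_objects() {
    local cutoff="$1" d t size url
    while read -r d t size url; do
        # Skip DIR lines and anything without a URL in the fourth column
        [ -z "$url" ] && continue
        if [ "$(date -d "$d $t" +%s)" -lt "$cutoff" ]; then
            echo "$url"
        fi
    done
}

if [ "$#" -eq 2 ]; then
    bucket="$1"
    days="$2"
    # Cutoff as seconds since the epoch, DAYS days ago
    cutoff=$(date -d "-${days} days" +%s)
    s3cmd ls "s3://${bucket}" | old_objects "$cutoff" | while read -r url; do
        echo "Deleting ${url}"
        s3cmd del "$url"
    done
fi
```

Listing first and filtering locally keeps the S3 calls to one `ls` plus one `del` per expired object.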

  • Adam C

    Love it! Thanks!

  • Ciprian

    Hi,

    Could you please help me rewrite the script so it deletes files from a bucket that has two directories? It’s something like this: s3://my_bucket/some-dir-1

    The script works great on buckets that have files in the root dir. Thank you for that!

  • Matt Daum

    Ciprian –

    You could do something like ./deleteOld "mybucket/subdir-1" 30

    I’d recommend looking into the object expirations that Amazon has added to S3 since this post. You can find that information via the link in the update at the top of this post.

  • JProffitt71

    Hi there, I made a version of this script for *nix users, since the date command is different there. It filters out the directories, tells you what it’s deleting, and accepts any single unit (30d, 1m, 1y).

    https://gist.github.com/JProffitt71/9044744

  • Matt Daum

    Nice update!

  • Pingback: Backup Docker to Amazon S3 by stefanXO