Deleting files older than specified time with s3cmd and bash

Update: Amazon has since added the ability to set expiration times (lifecycle rules) on objects in S3; see more here:

Recently I was working on a project where we upload backups to Amazon S3. I wanted to keep the files around for a certain duration and remove any that were older than a month. We use the s3cmd utility for most of our command-line calls to S3, but it doesn’t have a built-in “delete any file in this bucket that is older than 30 days” function. After googling around a bit, we found some Python-based scripts, but no simple bash script that did what I was looking for. I whipped this one up real quick; it may not be the best looking, but it gets the job done:
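The script itself did not survive in this copy of the page. What follows is a minimal reconstruction of the approach described (list the bucket with `s3cmd ls`, parse each object's modification date, delete anything past the cutoff), assuming GNU `date` and a configured s3cmd (`~/.s3cfg`); the `deleteOld` name matches the usage shown in the comments below.

```shell
#!/bin/bash
# deleteOld: remove objects in an S3 bucket older than a given number of days.
# Usage: ./deleteOld <bucket[/prefix]> <days>

delete_old() {
  local bucket="$1" days="$2"
  local cutoff file_epoch

  # Epoch seconds for "now minus N days"; anything modified earlier goes.
  cutoff=$(date -d "-${days} days" +%s)

  # `s3cmd ls` prints lines of the form: DATE TIME SIZE s3://bucket/key
  s3cmd ls "s3://${bucket}" | while read -r mod_date mod_time size uri; do
    # DIR placeholder lines have no fourth column; skip them.
    [ -z "$uri" ] && continue
    file_epoch=$(date -d "${mod_date} ${mod_time}" +%s)
    if [ "$file_epoch" -lt "$cutoff" ]; then
      echo "Deleting ${uri} (last modified ${mod_date})"
      s3cmd del "$uri"
    fi
  done
}

# Run only when both arguments are supplied.
if [ "$#" -eq 2 ]; then
  delete_old "$1" "$2"
fi
```

Saved as `deleteOld` and made executable, `./deleteOld mybucket 30` would then delete everything in `mybucket` older than 30 days.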

Posted In: Amazon AWS, Tips n' Tricks


  • Adam C

    Love it! Thanks!

  • Ciprian


    Could you please help me rewrite the script so it deletes files in a bucket that has two directories? It’s something like this: s3://my_bucket/some-dir-1

    The script works great on buckets that have files in the root dir. Thank you for that!

  • Matt Daum

    Ciprian –

    You could do something like ./deleteOld “mybucket/subdir-1” 30

    I’d recommend looking into the object expirations that Amazon has added to S3 since this post. You can find that information at the link at the top of the post.
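    As a sketch of that approach (assuming s3cmd 1.5+, which added the `expire` command, a configured `~/.s3cfg`, and with `mybucket` as a placeholder), a lifecycle rule makes S3 delete old objects itself, so no cron job is needed:

    ```shell
    # Expire every object in the bucket 30 days after creation.
    s3cmd expire s3://mybucket --expiry-days=30

    # Or scope the rule to one prefix, e.g. the subdirectory case above.
    s3cmd expire s3://mybucket --expiry-days=30 --expiry-prefix=some-dir-1/
    ```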

  • JProffitt71

    Hi there, I made this script for *nix users since the date command is different. It filters out the directories, tells you what it’s deleting, and accepts any single unit (30d, 1m, 1y).
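    That script isn’t reproduced on this page, but the unit handling it describes can be sketched roughly as follows (the `to_seconds` name and the 30-day month / 365-day year approximations are my own assumptions):

    ```shell
    # Convert an age like "30d", "1m", or "1y" into seconds, for comparing
    # against each object's modification time.
    to_seconds() {
      local n="${1%[dmy]}" unit="${1##*[0-9]}"
      case "$unit" in
        d) echo $(( n * 86400 )) ;;          # days
        m) echo $(( n * 30 * 86400 )) ;;     # months, approximated as 30 days
        y) echo $(( n * 365 * 86400 )) ;;    # years, approximated as 365 days
        *) echo "unknown unit: $1" >&2; return 1 ;;
      esac
    }
    ```

    The cutoff is then simply `$(( $(date +%s) - $(to_seconds 30d) ))`, which works with both GNU and BSD `date`.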

  • Nice update!

  • Pingback: Backup Docker to Amazon S3 by stefanXO

  • OdooPlay

    How do I configure it for 120 days?
    Thanks for the script.

  • Andrei C.

    Like if you use this in 2017 :)
    Thanks for the script.