How-To Geek

Delete Files Older Than x Days on Linux

The find utility on Linux allows you to pass in a bunch of interesting arguments, including one to execute another command on each file. We’ll use this to figure out which files are older than a certain number of days, and then use the rm command to delete them.

Command Syntax

find /path/to/files* -mtime +5 -exec rm {} \;

Note that there are spaces between rm, {}, and \;


  • The first argument is the path to the files. This can be a path, a directory, or a wildcard as in the example above. I would recommend using the full path, and running the command without the -exec rm part first, to make sure it matches the files you expect.
  • The second argument, -mtime, specifies the file’s age in days. Entering +5 matches files last modified more than 5 days ago.
  • The third argument, -exec, allows you to pass in a command such as rm. The {} is replaced by each matched filename, and the \; at the end terminates the -exec clause.
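To see what the command will match before anything is deleted, run the find portion on its own first. Here is a minimal sketch you can try safely; the /tmp/find-demo path and file name are made up for illustration, and touch -d is a GNU coreutils option:

```shell
# Scratch directory with one artificially old file (paths are examples)
mkdir -p /tmp/find-demo
touch /tmp/find-demo/old.log
touch -d "10 days ago" /tmp/find-demo/old.log   # backdate the mtime (GNU touch)

# Dry run: print what -mtime +5 matches, without deleting anything
find /tmp/find-demo -type f -mtime +5

# Real run: delete the matches
find /tmp/find-demo -type f -mtime +5 -exec rm {} \;
```

Only once the dry run lists exactly the files you expect should you add the -exec rm {} \; part.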

This should work on Ubuntu, SUSE, Red Hat, or pretty much any Linux distribution.

Want to Learn More? Get Linux Training from the Linux Foundation

Learn everything you could want to know about Linux with comprehensive online Linux courses and certification tests that include the real-world experience and first-hand knowledge of what it takes to be a successful Linux developer or system administration professional, all from a web browser in the comfort of your home.

The Linux Foundation is the nonprofit consortium dedicated to fostering the growth of Linux. Founded in 2000, they are supported by the leading technology companies and developers from around the world.

See All Linux Foundation Courses

Lowell Heddings, better known online as the How-To Geek, spends all his free time bringing you fresh geekery on a daily basis.

  • Published 02/21/07

Comments (81)

  1. Ryan

    How about just using the -delete flag rather than the -exec ?

  2. Geo

    “How about just using the -delete flag rather than the -exec ? ”

    Because if you have MANY files you will hit the “Argument list too long” limit..

    so instead, by using -exec you fork one rm process per result… hope it helps
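    A quick sketch of the difference, using made-up paths (the {} + form batches filenames into as few rm invocations as possible, much like xargs, while -delete needs no external process at all):

```shell
# Setup: two artificially backdated files (paths are examples; GNU touch)
mkdir -p /tmp/batch-demo
touch -d "10 days ago" /tmp/batch-demo/a.log /tmp/batch-demo/b.log

# One rm per match -- slow on huge trees, but never builds a long argv:
find /tmp/batch-demo -type f -mtime +5 -exec rm {} \;

# Batched: rm is invoked once per batch of as many names as fit:
find /tmp/batch-demo -type f -mtime +5 -exec rm {} +

# find's built-in action, no external process at all:
find /tmp/batch-demo -type f -mtime +5 -delete
```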

  3. milkmood

    GOOD STUFF. Dead simple, and prevented me from having to try and write a complex script that would take me days. Implemented this in crontab in 5 minutes.


  4. milkmood

    Now, can I pipe the results to a log file in the same line?

  5. Takoro

    >>Now, can I pipe the results to a log file in the same line?


    bash -c 'date;find /path/to/files* -mtime +5 -exec rm -v {} \;;date' > ~/mylog

    This way, you’ll get a line at the top with the date the command started (and another at the bottom when it finished).

    The last two semicolons are necessary, one is for find and one for bash.

  6. pierissimo

    Is there an argument for find so that directories will also be removed?

  7. Artem Russakovskii

    Pierissimo, well, the find command doesn’t care if it’s a file or a directory, so it’ll delete both.

    The above article mentions it (“This can be a path, a directory, or a wildcard as in the example above.”), but in order to force-delete the dirs, you need to use ‘rm -fr’ instead of just ‘rm’. -f will force deletion and -r will do it recursively on all subdirs.

    find can also accept a -type argument, where ‘find -type f’ will only find files, ‘find -type d’ will find dirs, and ‘find -type l’ will find links.
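    A small sketch of those -type filters in action (the /tmp/type-demo path is made up for illustration):

```shell
# Setup: one regular file, one subdirectory, one symlink (paths are examples)
mkdir -p /tmp/type-demo/subdir
touch /tmp/type-demo/file.txt
ln -sf file.txt /tmp/type-demo/link

find /tmp/type-demo -type f   # matches only file.txt
find /tmp/type-demo -type d   # matches /tmp/type-demo and subdir
find /tmp/type-demo -type l   # matches only the symlink
```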

  8. Train

    Outstanding. Thanks for the help!!! This worked wonderfully!!!

  9. fabry

    this is great.. I have used this like this:
    find /tmp/ -type f -mmin +30 -exec rm {} \;

    to remove files that were last modified more than 30 minutes ago.

    mtime matches files n*24 hours old, mmin matches files n minutes old (where n is the number that you pass after the -mmin or -mtime parameter)
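    A sketch of -mmin in action (made-up paths; touch -d is a GNU coreutils option used here to backdate a file):

```shell
# Setup: one fresh file and one backdated 45 minutes (paths are examples)
mkdir -p /tmp/mmin-demo
touch /tmp/mmin-demo/fresh.txt
touch -d "45 minutes ago" /tmp/mmin-demo/stale.txt

# -mmin +30 matches files last modified more than 30 minutes ago
find /tmp/mmin-demo -type f -mmin +30               # lists only stale.txt
find /tmp/mmin-demo -type f -mmin +30 -exec rm {} \;
```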

    again, thank you. your hint saved me a bunch of time.

  10. fischer

    This seems to need user input to delete the files; what is the switch to have it delete automatically?

  11. e_sandrs

    As mentioned, adding -f forces the delete.

    find /path/to/files* -mtime +5 -exec rm -f {} \;

  12. fischer

    But when I run it, it still asks if I want to delete and I must type in a “y”

  13. Bob

    @fischer: The behavior of your rm command is unusual. Normally it will quietly delete unless you tell it otherwise with the “-i” option. There is often a shell alias defined in .bashrc or somewhere similar that replaces “rm” automatically with “rm -i”. It may even replace “rm -f” for you. Overall it depends on the shell you use and whether your distro has fiddled with the rm command.

  14. Ed

    I had an issue like that with the cp command. Ended up that a previous admin had aliased the cp command to ‘cp -i’

  15. Steve

    The simplest general solution to avoid alias issues in scripts, at jobs, etc. is to use the full path to the program unless it is a shell builtin, e.g. /bin/rm instead of rm. This also protects you from certain types of malicious trickery.
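    A quick sketch of that advice (the /tmp path is made up; rm lives at /bin/rm on typical Linux systems):

```shell
# Show what the shell would actually run for "rm" (alias, function, or binary)
command -v rm

# Calling the binary by its full path bypasses aliases and shell functions
touch /tmp/fullpath-demo.txt
/bin/rm -f /tmp/fullpath-demo.txt
```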

  16. Karsten

    works like a charm!

    I changed it to allow moving instead of deleting

    find /path/test-source/* -mtime +30 -exec mv {} /path/test-mv-to/ \;

    Tested without using ‘-mtime +30’ option
    Feedback is welcome :-)

    Thanks again!!

  17. Melinda

    This post helped me tremendously; it didn’t provide my solution, but got me thinking in the right direction.

    I had fumble-fingered some keys in vi, and accidentally saved a file to “^?” which showed up under ls -la as a file with no name! I tried to isolate it by timestamp, but it was easier to use file size (which happened to be unique).

    Ultimately, this is the command that allowed me to recover the file:

    find /path/to/files/* -size 4246c -exec mv {} \;

    I’ll have to remember the find command! Quite powerful! :)

  18. Pony


    I’m using a .php script to automatically perform full cPanel backups via a cron job. It works quite nicely. However, space will become an issue real quick with these daily backups dumping into my root. I’m not using the ftp option to move them elsewhere at this time.

    I would like to set up a cron job in cPanel to automatically delete all cPanel backups that are 3 days old or older.

    Is this the correct command to do so or am I completely off base? If so, what is the correct command to work on cPanel 11 over Linux?

    find /home/user/backup-* -mtime +3 -exec rm {}\;

    The typical full backup reads like this: /home/user/backup-2.4.2009_18-11-01_user.tar.gz. I’m assuming the backup-* wildcard is correct usage as well.

    I would greatly appreciate it if someone could point me in the right direction.


  19. Melinda


    I wouldn’t consider myself an expert (since I just learned about this command recently), but it looks like the command you’ve constructed should do what you want.

    You could always create a test directory with some dummy files in it, and see if it works.

    Best wishes!

  20. Vince-0

    Frekin aye!

    Saved me some time – WHOOT for HowToGeek

  21. Buzzkill


    It looks like you are missing the space between {} and \; as explained in the directions.

  22. abdussamad

    regarding the -delete action:

    If you omit the * then you won’t get any errors and -delete will work

    find /path/to/files/ -mtime +5 -delete

  23. Pony

    I just wanted to say thanks for the feedback.

    I figured it out. Here is the Linux command I use on my Cron Job to auto delete all my cPanel backups that are 48 hours old or older. Everything is working great.

    find /home/username/ -name 'backup-*' -type f -ctime 1 -delete;

  24. larcin

    We currently are using this to delete files older than 60 days: find installs/JS_* -maxdepth 0 -mtime +60 -type d -exec rm -rf {} \;

    I was thinking it would be better to set something up that would look at the newest file in the dir and then delete anything older than 60 days. Not sure where to start on that.

  25. josh

    I think my question’s already been answered but I want to make sure I understand this right:

    I have an FTP server for my firm. All the FTP directories sit at: /ftp/ftpsites/X/

    so if I put a wildcard in for the X, will that ignore the main FTP site but delete all the subdata? So: /ftp/ftpsites/*/* -mtime +5 -exec rm {} \;

    I want it to save the parent directory, but that parent directory’s name will change depending on the FTP site.

    Thanks in advance for your help!

  26. Ahnolds

    Awesome command. I’d like to add that it works on macs as well (no surprise for a *nix based system). Solved the annoying problem of my downloads folder getting cluttered because it keeps *everything* I ever save :)


  27. Sir Hoagy

    How very cool this command showed up on “stumbleupon” when I just spoke with my friend today regarding reaping old files on our Linux server . We were going to have a cron command every Mon/Wed/Sat but this….THIS frickin’ is sweet! So much thanks!

  28. Matt Wells

    I run a script that takes the client’s URL and puts it into a file called database on my shared cPanel hosting. I want to empty that file once a day. What is the command line that I would enter into the advanced cron editor? I am probably wrong, but would it be something like:

    cp /dev/null /home/username/public_html/directory/cgi-2009/database

    Please let me know..

    Thanks Much,

    Matt W.

  29. Offer

    Hi all,

    This is all great, but I still can’t figure something out…
    let’s say I want to clean /tmp every day.
    Let us say file /tmp/dont_delete_me.txt is 7 days old. The process creating the file is still up and holding the file open. For this example, let us say the process writes a single line every once in a while…
    If I delete the file, what will happen to the holding process? Is there a way to know if a file is in use/open?

    I would like to clear a production server /tmp directory but never harm any running process…

    Offer Baruch

  30. Artelius

    In response to Offer (re deleting files which are still open in some process)

    Under UNIX, the filesystem is like a “map” you use to locate files. Sometimes there are two different paths to the same file (these are called “hard links” and can be created with the ln command).

    When you “remove” a file, actually you are just removing one method of finding the file (this is why removing a file is sometimes called “unlinking”). When you remove ALL methods of finding a file, the file is really deleted and no longer takes up space.

    In your case, there are two links to the file:
    the file’s name in the /tmp directory, and
    your program’s handle to this file

    If you remove the first of these, the file will not be deleted – it will still “exist” but will ONLY be accessible to the program that has the file open. When this program closes the file, the file disappears completely.
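    This is easy to demonstrate in a shell: open a file descriptor, unlink the name, and read through the surviving handle. A sketch with a made-up path:

```shell
tmpf=/tmp/unlink-demo.txt
echo "hello" > "$tmpf"
exec 3< "$tmpf"        # open a read handle on file descriptor 3
rm "$tmpf"             # unlink the name; the data stays on disk
ls "$tmpf" 2>/dev/null || echo "name is gone"   # prints: name is gone
cat <&3                # still prints: hello (via the surviving handle)
exec 3<&-              # closing the last handle finally frees the space
```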


  31. spage

    Great tip. Implemented this to clear stale mysql dumps.

    Offer… what about scheduling a cron job to touch the file daily, weekly, or whatever? This will keep the file from aging and thus keep it from being deleted by this command.

  32. dennis

    a lot of smart people here … =)

    this tip and the comments help a lot.

  33. Philippe

    Artem Russakovskii and others, is there an argument for find, so that directories will NOT be removed?

    (From a NTFS slave-disc, I need to delete the 200 000 oldest and 50 000 newest files and only keep the directory tree and all files from a 20-day period. (Permission of NTFS files “cannot be decided”, so ownership is missing, which causes 250 000 questions “cannot move to trash, cancel/remove?” when I try using GNU desktop). (It is a “blue screen of death” disc of course…))

  34. Philippe

    Please remove my post, I discovered faults in it!

  35. Ovidiu

    trying this on Debian 5.0 and getting an error, could you maybe help me figure out what is wrong?

    h1550830:/var/lib/amavis/tmp# find /var/lib/amavis/tmp/* -mtime +12 {} \;
    find: paths must precede expression: {}
    Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path…] [expression]

  36. Ovidiu

    I even tried this to no avail: find -path “/var/lib/amavis/tmp/*” -mtime +12 {} \;
    and this find -path /var/lib/amavis/tmp/* -mtime +12 {} \;

  37. goly

    If you just want to list the files (*not* remove them), remove the “{} \;” part from your command:
    find /var/lib/amavis/tmp/* -mtime +12
    If you want to see details for each file, try this:
    find /var/lib/amavis/tmp/* -mtime +12 -exec ls -al {} \;

    Check out find’s manual regarding “-exec” option:
    -exec command {} +
    This variant of the -exec option runs the specified command on the
    selected files, but the command line is built by appending each
    selected file name at the end; the total number of invocations of
    the command will be much less than the number of matched files.
    The command line is built in much the same way that xargs builds
    its command lines. Only one instance of {} is allowed within the
    command. The command is executed in the starting directory.


  38. Vince

    this one will run faster

    find /path/to/files* -mtime +5 | xargs rm

    (The xargs command enables users to run a single command on many files at one time)

  39. Bushrod

    Something like this might be a bit safer for xargs:

    find /path/dirs* -maxdepth 1 -type d -mtime +5 -print0 | xargs -0 rm -rf

    This might be overkill (depending on your find path):

    find /path/dirs* -maxdepth 1 -type d -mtime +5 -print0 | xargs -0 rm --preserve-root -rf
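    One reason the -print0 / -0 pair is safer: plain xargs splits input on whitespace, so a filename containing a space would be mangled. A sketch with made-up paths (touch -d is a GNU option):

```shell
# Setup: an old file whose name contains spaces (path is an example)
mkdir -p /tmp/xargs-demo
touch -d "10 days ago" "/tmp/xargs-demo/my old file.txt"

# Plain xargs would split the name into three arguments; NUL-delimited
# output (-print0) with NUL-delimited input (-0) passes it through intact
find /tmp/xargs-demo -type f -mtime +5 -print0 | xargs -0 rm -f
```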

  40. sujit

    Hi, guys, can you tell me how to find ssh login logs in SuSE ? Also, how to log the information like CPU load, Memory usage, disk space etc. at the time of possible hang up or crash ?

  41. Adam

    Tremendous – this is so helpful. I used this command in a cron job to delete large audio files (every 7 days) that my clients’ clients are uploading to her server. It was really filling up quickly, and this is a brilliant solution. Thanks everyone.

    find public_html/uploads* -mtime +7 -exec rm {} \;

  42. Gourav

    Very helpful, and very well explained :)

  43. RkaneKnight

    If you read far enough down the comments, you’ll see that the {} is actually a placeholder for the filename found by the base command
    find /path/to/files* -mtime +5

    so if it found a file name /path/to/files/foundit.htm then it executes a command of
    rm /path/to/files/foundit.htm

    I always use the {} form just because I may want to do other things than -delete. :)
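    You can watch that substitution happen by swapping rm for echo; with GNU find, {} is replaced even inside a longer argument. A sketch with made-up paths:

```shell
# Setup (paths are examples)
mkdir -p /tmp/subst-demo
touch /tmp/subst-demo/a.txt /tmp/subst-demo/b.txt

# Prints one "would run: rm <path>" line per matched file
find /tmp/subst-demo -type f -exec echo "would run: rm {}" \;
```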

  44. Someone

    Thanks for great code HTG! :-)

  45. christine

    I think I love you… :)
    This is exactly what I needed thanks!

  46. Brett

    Worked nicely. Thank you.

  47. sumir

    Huge Thanks… Saved a lots of time…

  48. omar

    find path -type f -mtime +7 | egrep '\.log$|\.out$'

    I am able to get the files modified seven days ago, but some of the files are still in use by a process. How can I find that out from the command line?

    eg :- on the file
    fuser /usr/appl/xat/runtime/xat_admin_server/els_xml.log
    /usr/appl/xat/runtime/xat_admin_server/els_xml.log: 27120

    PID 27120

  49. Qasim

    I’m using Linux. With the command below it shows results from the last two days, but the system doesn’t respond when using ‘-mtime +2’. Why?
    find /path/to/file* -mtime -2

    Secondly, I want to keep yesterday’s backup, so I need your help to delete files older than yesterday (the day before yesterday and earlier).

    Please share the findings!! Thanks

  50. Shashikanth Prasad


  51. rino

    Is there any way we can mark posts like this as favorites on HTG? It would tremendously help in recalling them. Of course, one can always use Evernote to gather such good articles.

  52. MJS

    Really Thanks

  53. sss

    That was a great post.
    Now I can create a one liner script and call it in a cron job.
    Saved a lot of time!

  54. Naveen

    Great!!!!! Working Beautifully….

  55. Martyn

    Hi guys,

    Great link, thank you. I want to use something similar to delete files over 5 days old (for example), but in that directory only. I have subdirectories, and the find expression rightly finds the files in them which match the parameters, but I don’t want those deleted.

    I have looked at the find man page and get the impression this is not the expression for me (I cannot see a way of limiting it to the directory only and no subdirectories).

    Does anyone see a way of overcoming this that I have missed?

    Cheers for a great post.

  56. Stuka

    Funny thing is, I am doing this on a USB pen drive (FAT).

    So every time I run the command:

    find -mtime +5 -exec rm -f {} \;

    it gives me the message:

    FAT: Filesystem panic (dev sdb1)
    fat_free_clusters: deleting FAT entry beyond EOF
    File system has been set read-only

  57. Adrian

    My Nagios stopped because of too many files in /usr/local/nagios/var/spool/perfdata/
    now I’m back in business

    find /usr/local/nagios/var/spool/perfdata/ -maxdepth 1 -mtime +10 -exec rm -f {} \;


  58. Rizal

    Thank You, Thank You, Thank You :) … Wonderfully help !!

  59. Autonomous

    One way to get around the issue of rm being aliased and asking for confirmation, even when the -f parameter is specified, is to use the full path to the rm command

    In my case that’s /bin/rm -f

    Adding a path to ‘find’ (which I needed to do to get it working as a cron job) and deleting files older than 14 days makes the entire command

    /usr/bin/find /path/to/files* -mtime +14 -exec /bin/rm -f {} \;

  60. Markus

    Thanks a lot, that’s fantastic and saves a lot of time!

  61. sanmayas

    i have one doubt regarding log cleanup. My question is: using the find command, how can I delete 7-day-old files?
    Could you please help me.
    find/var/log* -mtime+7 exec rm {} /; (this answer is correct or not)

  62. shergill

    hi, once you have deleted the files, how can you create a backup directory for today’s deleted files, e.g. /backup/core_files/ ?

  63. Stephen Character

    Is there a way to make it exclude certain files? I have them set immutable with chattr +i, but I keep getting prompted about these files and have to manually press n and enter

  64. Bolwerk

    I used

    find . -mtime +30 | xargs rm

    when dealing with a huge directory of mail where I wanted to delete everything older than 30 days. It was choking when I used the * wildcard.

  65. Martin Rønde

    “It was choking when I used the * wildcard.”

    Choking in the rm command happens on older linux/unix systems because the “*” expands into a command that is longer than 1024 characters. That’s why xargs is nice …
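    On modern Linux the limit is far above 1024 characters and can be queried with getconf; xargs exists precisely to batch arguments under that limit. A sketch:

```shell
# The kernel's limit on the combined size of argv + environment
getconf ARG_MAX

# xargs batches its input into command lines that stay under the limit
printf '%s\n' a b c | xargs echo   # prints: a b c
```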

    regards Martin Rønde Andersen

  66. tom

    Also works in Sun Solaris :)

  67. Foo

    Very good!!

  68. Naushad Farsi

    can anybody help me to “write a shell script to delete all directories which were created before a particular date”, e.g. 3 May 2011

  69. Martin

    How can we make a script to delete files X days old without using the find command?

  70. LUCKY

    Write a shell program to delete all directories which were created before a
    particular date?

  71. LUCKY

    Write a program in shell which accepts two files as arguments and then checks whether the contents of these files are the same. If the contents of both files are the same, then the program will delete the second

  72. LUCKY

    Write a shell script which receives any year from the keyboard and determines whether it is a leap year or not. If no argument is supplied then the current year is assumed.

  73. LUCKY

    Write a shell program that accepts a name from the user and checks whether the given name is a valid user of the system or not, and then displays the appropriate message

  74. LUCKY

    Sir…… plz help me

  75. TechGuyKevin


    To remove a directory just add -type d to the mix:

    find /path/to/files* -type d -mtime +5 -exec rm -r {} \;

    The way that I use this is to remove directories for old projects which are all in the same folder. This way the find completes quickly and you don’t accidentally remove older directories within newer ones :). Of course, using find . means it must be run in the directory containing the other directories. The . can easily be replaced with a path.

    find . -maxdepth 1 -type d -mtime +5 -exec rm -rf {} \;

    And of course if you just want to list them:

    find . -maxdepth 1 -type d -mtime +5

    If you want to list them with info so you know what you are about to delete:

    find . -maxdepth 1 -type d -mtime +5 -exec ls -dlh {} \;

  76. miky

    hi…. I have a problem: I must create a program that deletes temporary files at a specific period of time….

    help…plssssss :D:D

  77. Tracin

    Thanks Melinda… you saved me time…

  78. Jirapong

    Good stuff, thanks

  79. Abhishek

    This helped a lot. Gr8. Thanks


  80.

    We want to delete 7-day-old directories in Sun Solaris OS. Can anyone help me?

  81. Sarist

    I tried to delete all the files in a directory older than 2 months and it would not let me.
    I typed in: find /zone/group/results/ -mtime +60 -exec rm -f {} \;
    For every file it said: cannot remove: /zone/group/results/filename is a directory.
    Can anyone tell me what went wrong?
    Thanks in advance

