jbminn


Every sysadmin needs to quickly free up disk space from time to time, and here are two well-tested scripts I use for exactly this purpose.

The first is a simple command-line use of find's -exec action: find constructs a list of matching file names, in this case all gzipped tarballs, and then runs the rm command on each one. The curly braces "{}" act as a placeholder for each file name from the list, and the trailing "\" escapes the ";" so that it is passed to find as a literal command terminator rather than being interpreted by the shell. (The pattern should also be quoted so the shell doesn't expand it before find sees it.)

# find . -name '*.tar.gz' -exec rm {} \;

You may find (pun intended) that you need a slightly more sophisticated way to construct the list of things to remove. For example, you may want to recursively delete all files and directories at an arbitrary depth in the filesystem, except for certain special files or directories. To accomplish this, I use a small Perl utility I wrote that uses the File::Find module.

Here is the actual script I use, generalized with a placeholder $top and a fake /some-pattern-to-exclude/. The script crawls down a tree of files and directories (actually finddepth works from the bottom up), and unless the pattern is matched, each file in a directory and then the directory itself is deleted.
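The script itself did not survive in this copy of the post, so here is a minimal sketch of what it describes, assuming $top and /some-pattern-to-exclude/ are the placeholders named above:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Placeholder top-level directory, as in the post; pass it on the command line.
my $top = shift @ARGV or die "usage: $0 <top-directory>\n";

# finddepth() visits each directory's contents before the directory itself,
# so files are unlinked first and the then-empty directory can be rmdir'd.
finddepth(\&wanted, $top);

sub wanted {
    # Skip anything matching the exclusion pattern (a placeholder here).
    return if $File::Find::name =~ /some-pattern-to-exclude/;

    if (-d $_) {
        # Fails harmlessly (with a warning) if the directory still holds
        # excluded files.
        rmdir $_ or warn "could not remove directory $File::Find::name: $!\n";
    } else {
        unlink $_ or warn "could not remove file $File::Find::name: $!\n";
    }
}
```

Because finddepth works bottom-up, a directory that contains only deletable files ends up empty by the time wanted() reaches it, while a directory protecting an excluded file survives the rmdir attempt.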

How does this work? Pretty simple: the File::Find module exports a few routines, one of which is finddepth. The first argument it expects is a reference to a routine to run against all files and directories found, starting at the location named in the second argument. That wanted routine is run on each element in finddepth's list, and you can use File::Find's variables (such as $File::Find::name) to make your wanted routine smarter.
