I made a mistake in configuring logrotate on a new Linux system, and almost ran into a problem because of it. Fortunately I saw the problem before it became a BIG problem, but as a result, I decided to add a script to my Linux system to check for large files, typically log files that have grown out of control for one reason or another.
Here then is a simple Linux shell script I named LargeFileCheck.sh, which searches the filesystem for files that are larger than 1GB in size:
#!/bin/sh

# file where the list of large files is written
filename='bigfiles'

# search the filesystem for files larger than 1GB
find / -maxdepth 6 -type f -size +1G > $filename

# count how many large files were found
count=`cat $filename | wc -l`

# if any were found, add a timestamp and email the list
if [ $count -gt 0 ]
then
    date >> $filename
    mail -s "Large log files found on server" YOUR_EMAIL_ADDRESS < $filename
fi
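A quick way to test the script is to make it executable and run it once by hand. I'm assuming here that you saved it under /var/www/scripts, as in the crontab entry shown later, so adjust the path if yours differs:

# make the script executable
chmod +x /var/www/scripts/LargeFileCheck.sh

# run it once manually; running it as root avoids "Permission denied"
# noise when find descends into restricted directories under /
sudo /var/www/scripts/LargeFileCheck.sh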
If the script finds any large files it sends me an email; otherwise it takes no action. Note that you can modify the find command's -maxdepth setting as desired. I used 6 because some of my log files may be that deep under the root Linux directory.
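For instance, if you already know your logs live under /var/log, you can point find at that directory with a shallower depth; the directory and numbers here are just an example, not part of the script above:

# search only /var/log, at most three levels deep, for files over 500MB
find /var/log -maxdepth 3 -type f -size +500M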
Size settings
Here are the notes from the find command man page on specifying the -size setting:
-size n[cwbkMG]
    File uses n units of space. The following suffixes can be used:

    'b'    for 512-byte blocks (this is the default if no suffix is used)
    'c'    for bytes
    'w'    for two-byte words
    'k'    for Kilobytes (units of 1024 bytes)
    'M'    for Megabytes (units of 1048576 bytes)
    'G'    for Gigabytes (units of 1073741824 bytes)

    The size does not count indirect blocks, but it does count blocks in
    sparse files that are not actually allocated. Bear in mind that the
    '%k' and '%b' format specifiers of -printf handle sparse files
    differently. The 'b' suffix always denotes 512-byte blocks and never
    1 Kilobyte blocks, which is different to the behaviour of -ls.
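As a rough illustration of those suffixes (the directories and thresholds are arbitrary):

find /var -type f -size +100M    # files using more than 100 megabytes
find /var -type f -size +2G      # files using more than 2 gigabytes
find . -type f -size +2048k      # files using more than 2048 kilobytes (2MB)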
Crontab entry
I run the script once a day with a crontab entry that looks like this:
30 2 * * * /var/www/scripts/LargeFileCheck.sh
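That entry runs the script every day at 2:30 a.m. If you haven't added a cron job before, you can install the entry like this:

# open your crontab in an editor, then paste in the line above
crontab -e

# verify the new entry is in place
crontab -l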
Summary
If you need a script (or just a find command) to search your Linux system for large files, I hope this is helpful.