Even the largest hard drive has a limit. From time to time, you have to clean up some files to be able to work on new projects. Working on multiple WordPress projects, old and new ones, can use up a lot of storage. In a previous blog post from earlier this year, I showed you how to save storage by loading images from a live website. As images and other files in the uploads folder are usually the largest parts of a WordPress website, this already helps a lot. But there is another type of file that can easily fill up your hard drive in no time: database dumps.
Finding large database dumps
When you work on projects that have already gone live, you probably want to get the latest database from the live website and import it locally. But you might also have changed some settings in your local environment, so you make a backup of the local database before replacing it with the live one. That’s already two dumps. Then you play around a bit with the content and, just in case, make some more backups. And as some WordPress databases easily reach several hundred megabytes, your free disk space shrinks quite fast. So the first step you should take is to find all those large database dumps on your local disk. For that, you can simply run this command in your main local development folder:
find . -type f -size +10M -name "*.sql"
This command will find any SQL dump that is larger than 10 MB. You might even want to search for smaller files, if you have lots of them that sum up to several hundred megabytes in total. If you want to see how large all of these files are, you simply extend the command and run each result through the du
command:
find . -type f -size +10M -name "*.sql" -exec du -h "{}" \;
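If you also want to know how much space all matching dumps use combined, you can pipe the results through du and let it print a grand total. This is just a small sketch; the -c option of du adds a summary line at the end:

```shell
# Sum up the size of all matching dumps; -c adds a "total" line at the end
find . -type f -size +10M -name "*.sql" -print0 | xargs -0 du -ch | tail -n 1
```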
Now that you have found all of these files, you can delete those you really don’t need anymore. But what about files you still might need?
Compressing all database dumps
Well, you can simply use another command and compress all of them in one go. You just have to run this command:
find . -type f -size +10M -name "*.sql" -exec gzip "{}" \;
This will compress all database dumps larger than 10 MB using the gzip
command. I ran that command on my work laptop some days ago and was able to save more than 3.5 GB on my drive. It only took about 90 seconds to run. If I need to restore any of the compressed dumps, I would gunzip
them first and then import the dump into the database.
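Restoring one of the compressed dumps could look like this. The filename and database name are just placeholders for your own setup; depending on your environment you might use mysql directly, as sketched here, or a tool like WP-CLI:

```shell
# Restore the original .sql file (this removes the .gz file)
gunzip backup.sql.gz

# Or stream the dump into MySQL without decompressing it on disk first
# ("wordpress" is a placeholder database name)
gunzip -c backup.sql.gz | mysql -u root -p wordpress
```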
Find other large files
Once you’re done with compressing all database dumps, why not search for other types of large files that can easily be compressed or deleted? Some other good candidates are logfiles, like the debug.log
you will find in the wp-content
folder when you’ve enabled WP_DEBUG_LOG
in an installation. To find those files, either replace the search pattern or just remove it completely to find all large files:
find . -type f -size +10M -exec du -h "{}" \;
Searching without a file extension will probably also show you a lot of media files, which are easily large as well. But for those you might use the trick mentioned earlier and load them from a remote server.
Another type of file you may find are XML files. They may come from the WordPress XML exporter, but they could also be some other type of file that needs to be left uncompressed.
Caveat
Some of you may now think “Why not just compress all .sql files?”, but this can cause some issues. If you search without the size filter, you will find some files within plugins. Those files are usually used to create the necessary database tables when installing/activating the plugin. If you compress them, the plugin can probably no longer use them. So don’t use a too low file size in the filter. On my system, 1 MB was even small enough to catch all real database dumps but not those special files. The same is true for some XML files mentioned earlier. And you can’t compress logfiles if you still want to write to them. So don’t just compress every text file that is too large.
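To stay on the safe side, you could combine a lower size limit with an exclusion for the plugins folder, and empty logfiles instead of compressing them. This is only a sketch; the 1 MB limit and the paths are assumptions based on my setup:

```shell
# Compress dumps of at least 1 MB, but skip .sql files shipped inside plugins
find . -type f -size +1M -name "*.sql" -not -path "*/wp-content/plugins/*" -exec gzip "{}" \;

# Empty a logfile in place, so WordPress can keep writing to it
: > wp-content/debug.log
```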
Conclusion
Having a lot of projects locally needs a lot of disk space. Knowing how to save some of this valuable space is crucial, especially when you can’t simply upgrade the hard drive in your hardware. Compressing all those database dumps cleaned up a lot of disk space for me in no time and without the need to decide for every file whether I still need it or not.