
How I manage Lightroom’s Catalog Backups

I don’t usually do the whole computer-tips thing; maybe I should do more of it. The impetus for this post was a friend of mine who had close to 100GB of Lightroom catalog backups (and that was less than a year of weekly backups) sitting on his computer, taking up precious space on yet another drive stretched to its limits.

Lightroom’s catalog backups are merely copies of the currently open .lrcat catalog file. If your catalog is big, the backups will be as well. For example, my 30,000-image catalog takes up almost 700MB of disk space. Every week or so it gets backed up by Lightroom, so after 4 weeks there’s a total of 2.8GB of backed-up catalog files sitting on my drive in addition to the 700MB active catalog. Perhaps not significant for me, but when you have a 2GB catalog and you don’t keep on top of manually removing the backups, you can very quickly end up with a whole lot of space taken up by them.

Compression

I approach the size issue with the catalog and its backups in a multi-pronged way. To start with, I use Windows’ built-in NTFS compression to compress my catalogs and their backups. On average, I see about a 50% reduction in Lightroom catalog file sizes due to NTFS compression.

I would point out that, while it might seem similar to “zip” or “compressed” folders, NTFS (file-system level) compression is not the same thing. While you can zip the backup catalogs manually, you cannot compress the main catalog that way and still be able to use it.

As I understand it, macOS also offers file-system level compression for HFS+ formatted volumes. Unfortunately, from a cursory Google search, there doesn’t appear to be an easy way to enable or disable it without dropping into the command line. Moreover, since I don’t have a Mac, I can’t walk through enabling it. Further, Apple recommends against using HFS+ compression for compatibility reasons. In short, you might be able to do something with file-system level compression on a Mac, or not; I simply don’t know.

NTFS compression can be enabled through the Advanced button in the folder’s Properties dialog (right click -> Properties).
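If you’d rather not click through dialogs, the same NTFS compression flag can also be set from the command line with Windows’ built-in compact.exe tool. A quick sketch (the path is just an example; point it at wherever your catalog actually lives):

PS C:\> compact /C /S:"$HOME\Pictures\Lightroom"

The /C switch compresses, and /S applies the operation to everything in the folder and its subfolders.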

Getting Fancy with PowerShell and 7-Zip

NTFS compression is good, but not that good. My 696MB catalog compresses with NTFS compression to 296MB, but if I use 7-Zip to compress it into a 7Z archive file, it shrinks down to a mere 24.6 MB.

Just compressing the backups is good enough to save tons of space; throwing away old copies takes that even further. Unfortunately, carrying this out is a little more complicated than just turning on NTFS compression.

Normally, for scheduled-task kind of work, I turn to bash on a Linux box; that’s also what you Mac users would want to use. On Windows, though, that means PowerShell.

Initially I had envisioned a rather simple script that duplicated the command find $lr_bak_path -type d -mtime +30 -print -delete. In short, delete every backup directory that’s older than 30 days.
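For the curious, a rough PowerShell equivalent of that find one-liner would look something like the following sketch (this is illustration only, not what the final script below does, and it assumes $lr_bak_path points at your backups folder):

# Sketch: delete backup directories not touched in the last 30 days
Get-ChildItem $lr_bak_path -Directory |
  Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
  ForEach-Object { Remove-Item $_.FullName -Recurse -Force }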

That was the initial thought; after playing around in PowerShell for a while, I came up with the script published later in the article. Instead of deleting based on time, it deletes based on the number of backups I want to keep. That is, it’ll keep the last 4 backups even if they are 2 or 3 months old. It also compresses the backed-up catalogs into 7-Zip archives, so I get massive storage savings while retaining backups of my catalog.

One hurdle to running PowerShell scripts is that, by default, Microsoft sets PowerShell to not execute scripts at all. To fix this, you need to run Set-ExecutionPolicy from an administrator PowerShell prompt, as shown below.

PS C:\windows\system32> Set-ExecutionPolicy RemoteSigned

There are several policies you can pick from, but RemoteSigned is the most secure option that doesn’t also require you to sign your own scripts.
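If you’re curious what your current policy is set to before you change anything, Get-ExecutionPolicy will tell you:

PS C:\windows\system32> Get-ExecutionPolicy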

I also elected to use 7-Zip to compress my backups. The script checks whether 7-Zip is installed before trying to use it; if you don’t already have it, you can download it from 7-zip.org and install it. Two key points: first, you’ll need to edit the $lrbackup_path line to point to where your Lightroom catalog backups actually are; second, you might want to change the $backups_to_keep line to however many old backups you want to keep.

I automated the whole shooting match by running the script as a scheduled task through the Windows Task Scheduler. Getting the Task Scheduler to work is a bit tricky: you have to invoke PowerShell to run the script, not just the script itself. In the action dialog, the program/script will be %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe. Then, to run the script, I found that setting the arguments to -NoProfile -File "<path to your script>" -ExecutionPolicy RemoteSigned worked for me.
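Alternatively, on Windows 8/Server 2012 and later, you can register the task from PowerShell itself instead of clicking through the GUI. A sketch, assuming a made-up script location of C:\Scripts\Clean-LrBackups.ps1 and a weekly Sunday run:

# Sketch: register a weekly scheduled task that runs the cleanup script
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
  -Argument '-NoProfile -ExecutionPolicy RemoteSigned -File "C:\Scripts\Clean-LrBackups.ps1"'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName "Clean Lightroom Backups" -Action $action -Trigger $trigger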

If there’s demand for it, or I have some free time, I’ll look into putting together a video walk-through of the whole process. In the meantime, here’s the PowerShell script I’m running.

Note: This script requires PowerShell version 3 to be installed (it should install with the .NET Framework 4.5; if not, you can download it from Microsoft). Older versions of PowerShell (1 and 2) lack the -Directory switch for the Get-ChildItem cmdlet. You can tell what PowerShell version your computer has installed by opening a PowerShell prompt and typing $host.version.

The Script

# Script to compress and clean up Lightroom backups
# This script is provided as is and without any warranty or support.
# From: http://www.pointsinfocus.com/learning/digital-darkroom/how-i-manage-lightrooms-catalog-backups/

# Change this value to point where your Lightroom Backups are stored
$lrbackup_path = "$HOME\Pictures\Lightroom\Backups"

# Change this value to the number of backup copies you want to keep
$backups_to_keep = 4

# The following line makes the deletions a dry run; delete it after the first run of the script
$whatifpreference = $true

#
# Do not Edit beyond this point
#
$7z = Test-Path 'C:\Program Files\7-Zip\7z.exe'

function create-7zip([String] $aDirectory, [String] $aZipfile) {
  [string]$pathToZipExe = "C:\Program Files\7-Zip\7z.exe";
  [Array]$arguments = "a", "-t7z", "$aZipfile", "$aDirectory", "-r";
  & $pathToZipExe $arguments;
}

# Remove everything but the last $backups_to_keep backup directories.
Get-ChildItem $lrbackup_path -Directory | Sort-Object LastWriteTime -Descending | select -skip $backups_to_keep |% { Remove-Item $_.fullname -Recurse -Force }

if( $7z ) {
  #Compress and remove archives.
  $archives = Get-ChildItem $lrbackup_path -Directory
  foreach ( $archive in $archives ) {
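    # 7-Zip adds the .7z extension itself when the archive name doesn't have one,
    # so this creates "<backup folder name>.7z" alongside the original folder.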
    create-7zip ( $archive.FullName ) ( $archive.FullName )
    Remove-Item $archive.FullName -Force -Recurse
  }
  # Trim number of backups to $backups_to_keep.
  Get-ChildItem $lrbackup_path -Filter "*.7z" | Sort-Object Name -Descending | select -skip $backups_to_keep |% { Remove-Item $_.fullname -Force }
}
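Before you hand the script over to the Task Scheduler, it’s worth saving it somewhere sensible and running it once by hand with the $whatifpreference line still in place (the filename below is just a placeholder); Remove-Item will then only report what it would have deleted rather than actually deleting anything, though the 7-Zip compression itself still runs:

PS C:\> & "$HOME\Scripts\Clean-LrBackups.ps1"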
