
Lightroom 8 Years Later: A Critical Look at Virtual Copies

In the first couple of articles in this series I’ve looked at some of the broader user interface aspects of Lightroom. Now I want to start getting into more technical territory, starting with virtual copies.

As I’ve said over and over, one of the biggest features of Lightroom is that it doesn’t directly manipulate the pixel values of the images in the catalog. Every image you see in the interface is made up of two “parts”. First, there’s the original image file on disk, be it raw, JPEG, TIFF, PSD, or whatnot. Second is the set of instructions that tell Lightroom how to process and display that image; these are stored in the catalog.

One of the advantages of having these two separate parts to an image (the raw file and the recipe) is that it enables a very space-efficient way to handle multiple alternative versions of an image. You don’t need to store completely separate files on disk; you just need the bookkeeping in the database for multiple images that point to a single file on disk.

The space savings of virtual copies are nothing to sneeze at. The develop recipe can be as little as a few KB. Even big develop recipes — and I’ll be going into much more depth about the storage of Lightroom’s develop settings in a future article — will almost always be under a few hundred KB. Compare that to the 3–6 MB needed for a 10 MP JPEG, never mind a 50 MP raw at 60–70 MB, or the even larger file sizes needed for fully converted RGB TIFFs or PSDs.
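
To put rough numbers on that, consider a single 50 MP raw worked up seven different ways. A quick back-of-the-envelope calculation, using the ballpark figures above rather than measured values:

```python
# Ballpark figures from above, not measurements.
raw_mb = 65          # one 50 MP raw file, ~60-70 MB
recipe_mb = 0.1      # a generous ~100 KB develop recipe

# Seven versions of the image: one master plus six alternatives.
as_real_copies = 7 * raw_mb                  # duplicating the raw each time
as_virtual_copies = raw_mb + 7 * recipe_mb   # one raw, seven recipes

print(f"Seven real copies:    {as_real_copies:.0f} MB")    # 455 MB
print(f"Seven virtual copies: {as_virtual_copies:.1f} MB") # 65.7 MB
```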

Make no mistake, Adobe unquestionably got the overall concept of virtual copies right. However, there is one major detail that just drives me up a wall.

Adobe decided, for some reason, to impose an artificial distinction on the images in Lightroom. There are Master Photos and Virtual Copies, and these two “types” of images behave differently in certain subtle but important ways.

A master photo is simply the first entry in the catalog that’s created when an image is first imported. Beyond that, there’s nothing special about it other than that artificial distinction and the subtle differences in behavior.

I can only speculate as to why Adobe chose to do this, but if I had to guess, I think Adobe was trying to retain some kind of skeuomorphic analogy to film. That is, the first slide/negative is the master image, from which you make copies and do your work. However, such a distinction doesn’t exist in digital; all copies are identical to the source they came from. When all copies are identical until you change them, the impetus to track the “first” copy, or even generations at all, is almost completely negated.

Moreover, it’s not like there’s a clear technical limitation that requires this behavior. I’ve dug through the catalog. Adobe properly normalized the file references out of the image table: many images can refer to the same file. The master/virtual-copy distinction is, in fact, extra information that’s stored on top of that.
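
For the curious, the catalog is just a SQLite database, so you can see this normalization for yourself. Here’s a minimal sketch; the Adobe_images and AgLibraryFile table and column names reflect what I found digging through my catalog, but they’re undocumented and could change between versions:

```python
import sqlite3

# Always work on a *copy* of the .lrcat file, never the live catalog.
conn = sqlite3.connect("catalog-copy.lrcat")

# Each Adobe_images row is one "image" (master or virtual copy).
# rootFile points at the single AgLibraryFile row for the file on disk,
# and masterImage is NULL for masters -- the extra bookkeeping that
# creates the artificial distinction.
query = """
    SELECT f.baseName,
           COUNT(*)                       AS versions,
           SUM(i.masterImage IS NOT NULL) AS virtual_copies
    FROM Adobe_images i
    JOIN AgLibraryFile f ON f.id_local = i.rootFile
    GROUP BY i.rootFile
    HAVING COUNT(*) > 1
"""
for base_name, versions, virtual_copies in conn.execute(query):
    print(f"{base_name}: {versions} versions ({virtual_copies} virtual)")
```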

It would seem, to me at least, that Adobe thought there was some benefit to this. Certainly you can filter your library by master and virtual images, but I have to be honest, I’ve never really seen the utility in that.

Since I hinted at subtle differences in behavior, I should probably enumerate them. The first difference is that you can filter the catalog based on whether an image is a master image or a virtual copy.

The second difference has to do with deleting an image, or removing it from the catalog. When you delete a virtual copy, Lightroom only removes the virtual copy from the database. The image file remains on disk, and the master image and all other virtual copies remain in the catalog. However, when you delete a master image, Lightroom removes all of the virtual copies derived from that image, and potentially also the raw file stored on disk.
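
In pseudocode, the asymmetry looks something like this. This is purely a toy sketch of the observed behavior, not Lightroom’s actual code:

```python
from dataclasses import dataclass

@dataclass
class Image:
    name: str
    master: "Image | None" = None    # None means this is a master photo

catalog: list[Image] = []

def delete_image(image: Image, delete_file_on_disk: bool = False) -> None:
    """Toy model of Lightroom's asymmetric delete behavior."""
    if image.master is not None:
        # Virtual copy: only this one catalog entry goes away.
        catalog.remove(image)
    else:
        # Master: every virtual copy derived from it is removed too,
        # and Lightroom offers to delete the file on disk as well.
        catalog[:] = [i for i in catalog
                      if i is not image and i.master is not image]
        if delete_file_on_disk:
            pass  # os.remove(...) the original raw/JPEG

master = Image("IMG_0001.CR2")
copy = Image("IMG_0001.CR2 / Copy 1", master=master)
catalog += [master, copy]

delete_image(copy)    # removes just the virtual copy
delete_image(master)  # removes the master *and* any remaining copies
print(catalog)        # []
```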

Lightroom’s master images are “more important” than a virtual copy, even though they’re technically the same thing as far as the database is concerned.

Perhaps the best way to understand my complaint is through an example.

A lot of times I’ll work up many similar variations on the processing of an image to find the one I like best. I do this by making a virtual copy and tweaking the settings repeatedly. In the end I may have a processed master photo and a half dozen slightly different virtual copies. Even though virtual copies don’t take up much space, I find having a lot of extraneous images in my library less than desirable.

After working up the alternative versions, I typically clean up the ones I don’t want and move the “choice” develop settings to the master photo to keep. With Lightroom’s implementation of virtual copies, to do this I have to copy and paste the develop settings from my choice virtual copy to the master image and then delete all of the virtual copies. What I should be able to do is just delete all the “copies” I don’t want and have the last remaining virtual copy become the master image.
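
Sketching that workflow in the same toy terms, here’s the workaround versus what I’d prefer. Again, purely illustrative, not Lightroom’s API:

```python
from dataclasses import dataclass

@dataclass
class Image:
    settings: dict                   # the develop "recipe"
    master: "Image | None" = None    # None means this is a master photo

def keep_choice(catalog: list, choice: Image) -> None:
    """The workaround today: paste the chosen virtual copy's settings
    onto the master, then delete every virtual copy."""
    master = choice.master or choice
    master.settings = dict(choice.settings)   # the copy/paste step
    catalog[:] = [img for img in catalog if img.master is not master]

# What I'd prefer: just delete the unwanted versions and let the last
# surviving virtual copy *become* the master -- no copy/paste at all.

master = Image({"Exposure": 0.0})
copies = [Image({"Exposure": x}, master=master) for x in (0.3, 0.5, 0.7)]
catalog = [master, *copies]

keep_choice(catalog, copies[1])
print(catalog[0].settings)   # {'Exposure': 0.5} -- the chosen recipe
```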

The difference is not huge, and the workaround I use is workable. However, given that there are no real technical reasons that virtual copies need to be treated differently from master photos, the process is just frustrating to me.

The other area where I think Adobe could further improve virtual copies is with respect to publish services. One desire I have generally is to ensure the stability of images in publish services. I don’t, for example, want a different version of an image listed on my site than the one I have on my computer to make a print from. I do this by creating virtual copies in an attempt to fix the settings in place. It would be nice if there were a way to automatically create virtual copies when adding images to a collection or publish service.

So that about covers what I have to say about virtual copies in Lightroom. Again, Adobe got the overall design pretty solidly right. Internally, the catalog design is, so far as I can tell, sound in its implementation of virtual copies, and there’s nothing that would prohibit the more fluid interface that I would prefer. Moreover, for the most part, the UI doesn’t suffer from a whole lot of major issues, and the storage space savings are significant compared to keeping multiple real copies to accomplish the same effect.

Next time I’m going to really burrow into some technical details of the catalog as I look at the way Adobe stores develop settings, and the implications that has for the size and efficiency of the catalog.

Comments

Matt O’Brien

The Virtual Copy is a superb concept but, typical of Adobe, it is incomplete.

From my perspective it is fatally flawed, because virtual copy information is not saved within the XMP sidecar files.

Adobe could have solved this issue in one of two ways.

The concept of XML structures/schemas would allow Adobe to hold multiple virtual children of a master image within a single XMP sidecar file. Adobe decided not to do this.

Alternatively, Adobe could allow multiple XMP files (i.e. one per master and virtual copy). But alas, such an option has not been implemented either.

I no longer use virtual copies. If I want a different version of an image, I have an export preset which exports the raw file as an ‘original’. I no longer have to worry about whether an image is virtual or not, and can apply whatever develop settings I want to the second copy (including rename, move, tag as I wish, etc.). At any stage, I can use the actual images on my disk to rebuild my catalog, as I will not be missing my virtual images. With a folder of 500–600 images I may only need to do this once or twice, so the extra disk space required is trivial.

It would be nice if Adobe finished the Virtual Copy concept, but this is non-critical for me as I am very happy with my workaround.

    Jason Franke  | admin

    Hi Matt, welcome back.

    Interesting perspective.

    Alternatively, Adobe could allow multiple XMP files (i.e. one per master and virtual copy). But alas, such an option has not been implemented either.

    I think this is the only viable option. Changing the XMP schema would almost certainly be a breaking change for anything else that reads and uses the XMP format. On the other hand, multiple XMPs would simply not show up in software that doesn’t support this kind of XMP-based virtual copy mechanic.

Matt O’Brien

I agree, but as Adobe were the architects of the schema, it is disappointing they did not cater for the reality of multiple virtual copies. So be it. I have no doubt that if Adobe wanted to solve this issue, they could.

Another factor is that many users, especially those new to Lr, are unaware that the only place virtual copies exist is inside their catalog, and so can lose their virtual images in certain cases.

If virtual copies were stored inside the XMP data, then a big advantage would be the ability to rebuild a catalog from the files within a folder or group of folders.

    Jason Franke  | admin

    Matt,

    I agree, but as Adobe were the architects of the schema, it is disappointing they did not cater for the reality of multiple virtual copies. So be it. I have no doubt that if Adobe wanted to solve this issue, they could.

    XMP predates Lightroom. It was first introduced as part of Acrobat 5 in 2001, and it was made a core part of Creative Suite in 2005. Lightroom’s first public beta didn’t appear until 2006, and virtual copies weren’t introduced until February 2007. I don’t think anybody in 2001, or even 2005, was doing anything similar to virtual copies — I wasn’t doing photography at that time so I can’t speak from personal experience, but all the photographers I know who were shooting digital in that era were making non-virtual copies of manipulated bitmaps and using simple file/thumbnail browsers.

    At this point XMP is an ISO standard, so changing it is more of a hassle — which is also why it’s probably easier to support virtual copies as multiple XMP files than to change the XMP standard to support virtual copies in a single file.

    Maybe they could have done something in 2006–2007, but I wouldn’t be surprised if there wasn’t a lot of resistance even inside of Adobe to changing the XMP format. At that point XMP was established, and they would have to worry about backwards compatibility with Creative Suite apps that were already released but had not been updated. And it’s not like they had a nice subscription model at the time that could push people along to the new version regularly.

    Another factor is that many users, especially those new to Lr, are unaware that the only place virtual copies exist is inside their catalog, and so can lose their virtual images in certain cases.

    IIRC, the default in LR is not to save data to XMP files at all. So fundamentally, except for completely unprocessed bitmaps, none of your “images” really exist outside of Lightroom anyway. If you remove them, you’ve lost that image whether it was a virtual copy or not.

    If anything, I think Adobe’s perspective on this would be to either a) add a “recycle bin”-like function for virtual copies when they’re deleted, or b) provide a more intuitive restore-from-backup capability. I don’t think Adobe wants people storing stuff in XMP files, not that I necessarily agree. XMP files are too “portable”, while the catalog gives them more leverage in terms of lock-in.

Matt O’Brien

Thanks for the background re XMP. I was not aware of its origins. Useful to know.

However, the following is an extract from https://en.m.wikipedia.org/wiki/Extensible_Metadata_Platform:

“XMP metadata can describe a document as a whole (the “main” metadata), but can also describe parts of a document, such as pages or included images. This architecture makes it possible to retain authorship and rights information about, for example, images included in a published document. Similarly, it permits documents created from several smaller documents to retain the original metadata associated with the parts.”

I do not claim to be an expert on XMP, but the spec seems to allow for multiple documents/images within a single XMP file.

It would be nice if Adobe made Virtual Copies a more complete solution, but anyway, it is always useful to understand the underlying architecture of apps such as Lightroom, which provide mission-critical functionality for a lot of people around the world.

It looks to me like Adobe are in “Do The Min Possible” mode for Lightroom, so I’m not optimistic that we will see any effort on improving Lightroom “usability” anytime soon. Hopefully I am wrong.

    Jason Franke  | admin

    Matt,

    I do not claim to be an expert on XMP, but the spec seems to allow for multiple documents/images within a single XMP file.

    I’m not an expert on XMP either, though I may look into that just out of curiosity.

    It would be nice if Adobe made Virtual Copies a more complete solution, but anyway, it is always useful to understand the underlying architecture of apps such as Lightroom, which provide mission-critical functionality for a lot of people around the world.

    I agree.

    It looks to me like Adobe are in “Do The Min Possible” mode for Lightroom, so I’m not optimistic that we will see any effort on improving Lightroom “usability” anytime soon. Hopefully I am wrong.

    I’m afraid of that too.

    Though I haven’t really touched on it in this series, one of the places where I’m really starting to find Lightroom lacking is the development/rendering engine. It’s not bad, per se, but on the other hand it’s never been the greatest at resolving fine detail. Even more pressing, perhaps, is that software like Canon’s Digital Photo Professional is now capable of removing diffraction-related softening. With high-density sensors on crop cameras already being diffraction-limited at f/5.6, or very shortly after, and most consumer lenses typically being around f/5.6, being able to process diffraction out of the image is a huge plus in retaining image quality for crop users, and it’s something that Lightroom simply can’t do right now.
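
    To put some rough numbers on the diffraction point, here’s a back-of-the-envelope calculation. The two-pixel visibility threshold is a common rule of thumb, not a hard limit:

    ```python
    # Airy disk (first minimum) diameter: d = 2.44 * wavelength * f-number
    wavelength_um = 0.55                 # green light, ~550 nm
    f_number = 5.6
    airy_um = 2.44 * wavelength_um * f_number

    # 24 MP APS-C sensor: ~23.5 mm wide, 6000 pixels across
    pixel_pitch_um = 23.5 * 1000 / 6000

    print(f"Airy disk at f/{f_number}: {airy_um:.1f} um")        # ~7.5 um
    print(f"Pixel pitch: {pixel_pitch_um:.1f} um")               # ~3.9 um
    print(f"Airy disk spans {airy_um / pixel_pitch_um:.1f} px")  # ~1.9 px
    ```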
