Jeremiah Rogers

About Subscribe Gallery

Processing Thousands of Images in Lightroom

I’m becoming a harsh editor of my own work. In the process of switching to a new computer I just went through my entire six-month photo catalog and deleted every image I didn’t love. I ended up with 500 photos to define my trip. That’s still far too many images, but they fit comfortably into only 10 gigabytes of hard drive space.

The benefit of keeping only images I love on disk isn’t just the saved space: my standard is going up. I no longer want to add shitty images to the collection, there are no shitty images in the collection, and I have an easier time showing people my best work.

I’ll note that deleting so many photos is not really a big deal because I now back up my photos to Amazon S3. The backup is push only, so images are never actually deleted. I wrote a quick command line script to process all of my DNG files into 10-50kb JPEGs before deleting them so I can still browse the full library for a lost image and pull it off S3 when I need it.

Amazon S3 pricing is super low now and securely backing up my full 80 gigabyte extended library costs less than $2 per month. I don’t expect to ever go through those discarded images again, but they are safe just in case.

While deleting all these pictures I came across a good heuristic for trimming photo libraries that may be obvious to any computer scientist but wasn’t to me.

How I Pick My Best Images

Let’s say I have 10 portraits of the same person under the same light in the same location. Basically it’s 10 copies of the same shot and I want to pick the best one.

  • First, make sure all images are unflagged in Lightroom (as they are by default) and set Lightroom’s filter to “unflagged only.” Now when you flag an image as a pick by hitting “p” or as a rejection by hitting “x” it disappears from view immediately.
  • Assume that image 1 of 10 is the best and don’t reject it or pick it. This way it stays in view until you’re done.
  • If image 2 of 10 is worse than image 1 of 10, reject image 2 of 10. It disappears immediately from view. Otherwise reject image 1 of 10.
  • Now you’ve got nine images. If image 2 of 9 is worse than image 1 of 9, reject it; otherwise reject image 1 of 9.
  • Continue this process until you’ve got only one picture left. Mark the one picture that remains as a pick by hitting “p”.
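The steps above are just a linear tournament: hold on to a provisional best, compare each remaining image against it, and reject the loser each round. A minimal sketch in Python, where `better` is a hypothetical comparison function standing in for my own judgment in Lightroom:

```python
def pick_best(images, better):
    """Return the one image that survives pairwise elimination."""
    best = images[0]           # assume the first image is the best
    for challenger in images[1:]:
        if better(challenger, best):
            best = challenger  # reject the old provisional best
        # otherwise the challenger is rejected
    return best

# Example: "judge" by a made-up numeric sharpness score.
shots = [("shot_01.dng", 3), ("shot_02.dng", 7), ("shot_03.dng", 5)]
winner = pick_best(shots, lambda a, b: a[1] > b[1])
print(winner[0])  # shot_02.dng
```

Each image is compared exactly once, so 10 near-duplicates cost only 9 comparisons, and exactly one pick survives.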

For me this process is much better than picking and rejecting images as I go through them. I often take the same shot a few times to make sure it isn’t blurred and end up with 10 nearly identical images. This way I’m sure I keep the best one without flagging three or four as keepers.

My camera recently passed 18,000 exposures, which means I’ve shot about 100 pictures per day over the past six months. As I said, my “keeper” library is only 500 images on my machine and consumes only 10 gigabytes of space. Cutting the library from the 5,000 images I’d already kept down to 500 was easy this way.

My standards change over time, so I don’t expect anyone else to keep only 1 of every 36 photos like I do. Images I used to love no longer meet my standards. And I’m actually shooting much more liberally now that I’ve figured out an easier way to manage my library.

Making Lightroom Faster

Check out Adobe’s full list of tricks for making Lightroom faster.

By far the top tip I’ve found is to generate standard-sized previews before I start working with images. It’s easy to set this up to kick off immediately after import. Since my computer has a high resolution screen I make Lightroom generate the largest possible standard-sized preview (about 2,800 pixels wide).

If your computer is faster you can generate 1:1 previews. I don’t, because they take 20-30 seconds for me to generate and take up a lot of space on my machine. Standard previews give me most of the performance benefit and only take a few seconds to generate.

Script for Creating Thumbnails

I wrote a simple script to take a directory of Leica M9 DNG files and convert them into small JPEGs. My output varies from 10 to 50 kilobytes each depending on the input image complexity. You can grab a copy here.

The script takes about 30 seconds to process each image. It’s not ideal, so I let it run overnight or while my computer is plugged in.
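The original script isn’t reproduced here, but its core idea, re-encoding at progressively lower JPEG quality until the file lands in the 10-50 kilobyte range, can be sketched with Pillow. This is an illustration, not the actual script: decoding the Leica DNG raw files would need an extra library such as rawpy, which is omitted here.

```python
from io import BytesIO
from PIL import Image

def make_thumbnail(img, max_bytes=50_000, max_width=800):
    """Shrink an image, then lower JPEG quality until it fits the byte budget."""
    img = img.convert("RGB")
    img.thumbnail((max_width, max_width))  # resize in place, keeps aspect ratio
    for quality in range(85, 10, -10):     # try progressively lower quality
        buf = BytesIO()
        img.save(buf, "JPEG", quality=quality)
        if buf.tell() <= max_bytes:
            return buf.getvalue()
    return buf.getvalue()                  # fall back to the lowest attempt

# Example: compress a synthetic test image.
data = make_thumbnail(Image.new("RGB", (3000, 2000), "gray"))
print(len(data) <= 50_000)  # True
```

Simple images compress far below the budget on the first try; busy, high-frequency frames take a few passes, which is why output sizes vary with image complexity.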

Backup Script

I find that having bad images in my library clutters my mind. Like many people I’m a packrat and want to keep thousands of images around “just in case.” It’s ultimately a waste of storage space and lowers my photography standards.

When I first import pictures I export low resolution 10-50kb JPEG images to a folder called “filmstrip” in my Dropbox. You can find instructions for that above. I can use these lightweight reference JPEGs to quickly find a file on S3 even if a backup isn’t available locally.

After creating the reference JPEGs I back up every raw file to Amazon S3 using “s3cmd.”

s3cmd sync /home/jeremiah/Travel_Imports/ s3:// --no-delete-removed -r -v --multipart-chunk-size-mb=5

You can install s3cmd easily on Linux or a Mac; on Linux it’s available in most package managers. Lower “multipart-chunk-size-mb” on slow connections so that uploads don’t time out.

Once this step is done I’m free to delete every image from my local library that I don’t think is outstanding.