Elliott Mariess | Photography


Honest Review of Aftershoot AI Culling

Disclaimer: This is an honest review based on my personal experience with Aftershoot AI culling software. I have not been paid or incentivized to write this review, nor am I offering referral links. I paid for a year's subscription to test it thoroughly. The images shared in this post are screenshots of raw, unedited photos and include rejected images. They do not represent the quality of my final images or overall work—they’re merely for educational purposes.

Example Final selected image (done manually)

Why I Tried Aftershoot

As a wedding photographer, culling is one of the most time-consuming parts of my workflow. I typically shoot between 3,000 and 5,000 images at a wedding and deliver a curated selection of 500 to 1,000, depending on the length of the day and how much there is to capture. Aftershoot's AI culling feature caught my attention as a potential way to speed up this process. Since I have my own process and style for color grading, refined over years in Lightroom, I chose the culling-only subscription to see whether it could streamline that part of my workflow.

To thoroughly test it, I ran several past weddings through Aftershoot to compare its results with my Lightroom selections. Here’s what I found.

Example Final selected image (done manually)

Example Final selected image (done manually)

Case Study Results

For one recent wedding, Aftershoot culled 4,800 images down to 1,500—a far cry from my actual Lightroom selection of around 580 images. I set the software to be as stringent as possible, intending for it to only select the absolute best images. Unfortunately, the results were less than impressive.

What Worked

Aftershoot’s speed is undeniable. It processed thousands of images much faster than I could manually. The interface is straightforward, and it’s easy to see how some photographers might find value in its automation if they don’t have a highly curated selection process.

What Didn’t Work

Despite its speed, the AI’s decision-making felt random and often counterproductive:

  1. Over-selection of Burst Mode Images: During key moments, I often shoot in burst mode to ensure I capture the perfect frame. From a series of burst images, I might keep one if I actually liked the moment. Aftershoot, however, often kept eight or more nearly identical shots, none of which I would’ve actually chosen.

  2. Irrelevant 5-Star Ratings: The software rated unusable images as five stars, including:

    • Blurry shots of feet triggered accidentally.

    • Test shots where I was adjusting exposure.

    • Completely out-of-focus images, or frames taken during focus stacking (ring shots).

  3. Key Moments Rejected: Worryingly, Aftershoot rejected crucial moments such as:

    • Intimate interactions between a mother and daughter during preparations.

    • A video call between the bride and her sick father.

    • Fun and emotional moments during the first dance.

    • Key moments during the vows.

    • Hugs with friends and family.

    • The actual first kiss (it selected several frames just before it instead).

  4. Inconsistent Ratings: When comparing groupings of similar images, Aftershoot often rated the least meaningful photos as five stars while giving lower ratings to better-composed, more impactful shots.


Wrongly Selected Examples

Over-selection of Burst Mode Images

Test shots where I was adjusting exposure

Blurry shots of feet triggered accidentally

Out-of-focus images taken during focus stacking

Over-selection of similar images (focus stacking images)


Wrongly Rejected Examples

Intimate interactions between a mother and daughter during preparations.

Fun and emotional moments during the first dance

Key moments during vows

Intimate interactions between a mother and daughter during preparations.

A video call between the bride and her sick father.

Hugs with friends and family

The actual first kiss (it selected several frames just before it instead)

Inconsistent Ratings

The Fundamental Problem

The core issue is that the AI lacks context. It simply doesn’t understand:

  • Why an image might be meaningful.

  • My shooting style or preferences.

  • The emotional weight of certain moments.

As a result, I found myself spending more time reviewing its selections than I would have spent culling manually. Instead of saving me time, it created additional work.

Final Thoughts

Aftershoot's AI culling feature feels like handing the job to an inexperienced junior assistant. It doesn't understand your vision, your style, or the importance of certain moments. I wouldn't trust any photographer who relies on this software for client delivery without manually reviewing every selection.

For me, this experiment reinforced how crucial the culling process is. It’s not a step that can be shortcut, no matter how advanced the technology claims to be. Context, taste, and firsthand experience from the day are irreplaceable.

Example Final selected image (done manually)

Recommendation

Based on my experience, I cannot recommend Aftershoot or any AI culling software. If you value control and quality in your workflow, you're better off sticking to manual culling. While AI software might improve in the future, it currently falls short of being a reliable tool for professional photographers.

My key takeaway: Selecting the right images is an art as much as it is a skill, and no AI can replicate the human touch when it comes to understanding what makes a moment truly meaningful.