Chapter 13

Asset Performance & Conversion Reporting

I have long referred to Asset Performance and Conversion reporting as the “holy grail” of photo studio reports - and not without reason! After all, the ultimate goal of all product photography is to sell actual products. You could say all other reporting is secondary. If your studio's images aren't performing well for your company or client, you've got a serious issue to address immediately, and any talk of "studio optimization" or "process improvement" is probably going to take a back seat until products are filling customers' carts again.

There is endless writing about conversion optimization, conversion rates, and all the various tweaking of CTAs, design, copy and more that ecommerce businesses experiment with, much of it by UX professionals with more to say on the subject than an old studio vet like myself. The focus of this guide is still photo studios, so we'll be looking at the topic of performance and conversion through the lens of images and image creation – and not all of the other levers that can be pulled to optimize a site. Basically, the things that a studio can impact - by producing different (or more) images, that would ideally lead to better conversion rates.

Some Common Focus Areas When Looking at Asset Performance:

  • Recognizable models versus unrecognizable models
  • Shooting on-model versus shooting on mannequins
  • Costs per shot related to performance
  • Visitor path (how shoppers arrived at your PDP)
  • Styling
  • Hero image (trying different angles)
  • Videos versus no video
  • Additional alts versus no additional alts
  • Talent/team responsible for images
Because conversion is linked to sales, the value of reporting on it is straightforward (duh, money). But if something is so obviously important, why is it so difficult to find studios able to exemplify this in action? The answers to this question are as varied as the brands, studios and retailers who shoot photography. But mostly, this disconnect can be boiled down to two common denominators: struggling tech stacks and organizational silos.

Your tech stack is the collection of technologies and tools your company uses to accomplish work. Your PIM, DAM, PM systems, production software, Capture One, Adobe Suite, etc. are all part of your tech stack.

An organizational silo is an area of a company or organization that operates almost separately from other teams. This means information, resources, ways of working, and reporting are not being shared effectively. Studios often exist in a silo!

Let's take a look at each of these challenges, before we get into what you should look for when reporting on asset performance and conversion.

The Long-Suffering Tech Stack

Basically, if your tech stack isn't strong (or, at least, modern), this reporting isn't really going to be possible. Product data and associated imagery sometimes have a really long road from conception to production to publishing, and there is data and metadata needed from each step. Which is to say, by the time images end up being published online, they are often pretty removed from the context of how they were produced - unless your systems of record have been quietly recording events and metadata, building a fuller story of each image's path. Useful context is often lost in the gaps between various systems, making useful reporting on those things impossible. We're talking about questions like: did our more expensive assets perform better? Did the assets we shot with the new style guide perform better? Is there a photographer who creates higher performing assets? What about a shooting style? On-model? Off-model?

How are so many studios in a position where they don't have access to answers to these kinds of questions?

For one, accounting software is often not linked to production software, making true costs per shot, or per product, difficult to estimate. Production software is often not linked to publishing software, making it difficult to easily know what the teams and talent were for a particular asset. Even in a good situation, some or all of this information is often buried in metadata, with no one really tying everything together for useful reporting or analysis. There may be a separate project management system used by producers, or a model booking software used for tracking talent. Or, the majority of this work may be taking place “brute force” style, in spreadsheets.

At the very minimum, you'll want all of your tools to be able to share data, and to be able to produce useful data. For the "sharing data" part, this can be achieved by rolling up multiple systems into a business intelligence platform like Looker or Tableau. You can read more about these systems in Chapter 3: Systems and Data. For the "producing useful data" part, you'll want to be using a modern system for your productions, and structuring your data so that it can be extracted. While a colorful, hyper-formatted shared spreadsheet might work for one step of this process (say, tracking talent), it's unlikely that you could link that spreadsheet to a BI system and export structured data from it. In practice, using the talent tracking example, this means your talent is not associated with the images that you produce. Even spreadsheet-esque systems like Monday or Airtable can integrate with data tools (somewhat), assuming they are structured in an organized fashion. That said, if you're working in a tool that's actually made for photography production, you'll have to do less work to wrangle useful data.
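To make the "structured data" point concrete, here's a minimal sketch in Python (with entirely hypothetical field names and systems) of the kind of flat join a BI platform needs: associating talent bookings with the assets they appear in via a shared shoot ID, so that talent travels with each image into a rollup.

```python
# Toy rollup: associate talent with the assets they appear in, keyed by shoot_id.
# In a real stack this join would happen in a BI tool over your production
# system's exports; all names and numbers here are hypothetical.

talent_bookings = [
    {"shoot_id": "S-101", "talent": "Model A", "day_rate": 1200},
    {"shoot_id": "S-102", "talent": "Model B", "day_rate": 800},
]

assets = [
    {"asset_id": "IMG-001", "shoot_id": "S-101", "style": "on-model"},
    {"asset_id": "IMG-002", "shoot_id": "S-101", "style": "on-model"},
    {"asset_id": "IMG-003", "shoot_id": "S-102", "style": "tabletop"},
]

def roll_up(assets, talent_bookings):
    """Join assets to talent by shoot_id - the kind of flat, structured
    table a BI platform can actually ingest."""
    by_shoot = {b["shoot_id"]: b for b in talent_bookings}
    rows = []
    for a in assets:
        booking = by_shoot.get(a["shoot_id"], {})
        rows.append({**a,
                     "talent": booking.get("talent"),
                     "day_rate": booking.get("day_rate")})
    return rows

rows = roll_up(assets, talent_bookings)
print(rows[0]["talent"])  # Model A
```

The point isn't the code itself, but the shape of the output: one flat row per asset, with production context attached, is what makes "did Model A's images convert better?" an answerable question.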

In summary: work in modern tools that are built for the job of photo production. And then roll the data from those tools up into one place. This will give you the context that is key to understanding any results of asset performance and conversion testing.

Organizational Silos

The other, non-tech, reason for the relative rarity of useful asset performance testing within the studio is trickier: organizational challenges. Even in companies with a strong tech stack, these decisions and analyses are often made completely separately from studio partners. This is a lost opportunity, as the studio and its team of creators would absolutely be involved and invested in performance testing. Chances are, if you are a brand or retailer, your company is already doing some form of conversion tests, like A/B testing. And chances are, most roles in the studio are not involved.

A/B testing is a method of comparing two versions of something on a website, email, app or other published medium against each other to determine which one performs better. This is done by serving two different experiences to users accessing content and observing how each performs.

In the context of photography, this can mean testing things like the following to see if they impact conversion:

- Having product video versus not having product video
- Featuring more expensive model A versus less expensive model B wearing the same items
- Offering multiple alt views versus fewer alt views
- Using different styles of photography

A good starting point is the example of a commercial studio whose clients test the images it produces. I've worked with commercial studios that partner with clients to run A/B tests using images the studio produced, objectively proving that these externally produced images perform better than the images the client had been using. That's a pretty effective way to win business! These commercial studios were able to take themselves out of their own silo as a vendor, become a partner to the brand, and become invested together in the success of the brand's asset performance.

While this approach obviously makes sense for a commercial studio, who is directly invested in winning business, I would argue that it also makes sense for in-house teams. Don't let site merchandising teams or random creative higher-ups perform optimization testing with the studio's imagery in a vacuum! It's a huge missed opportunity for partnership. For one, the studio can provide valuable context and perspective about imagery that performs better or worse. Perhaps more importantly, studio teams should be clamoring for more testing and more shared results, adopting KPIs around asset performance. Rather than the more common experience, where testing results are delivered to studio teams from on high, with little context, studio teams should be partnering with their site teams, proposing their own tests, and seeing how they can move the needle on producing higher performing assets.


OK, But What Are We Looking For?

So, let's say your tech stack is modern and capable of producing and sharing useful data. Let's also say your organization isn't siloed and cross-functional teams are working together. Now let's talk about the actual things people are looking for when testing asset performance. As we mentioned at the outset, there is a whole art and science behind conversion testing. You don't need to be an expert on this from the studio side; instead, you should be partnered with someone who can lead the technical aspects of this journey.

But you will want to know what the common things people test for are, what their goals are, and where those things intersect with your corner of the business. Well, what folks are looking for is simple: conversions. And they're looking for your studio's assets to drive those conversions. We've all probably worked with that horrible Creative Director who just liked to give images a “thumbs up” or “thumbs down” without any explanation of why the images were good or bad. This is an exercise in the exact opposite: let's find out why we think these images work (or don't work). Each asset your studio creates has a number of properties, each of which can be seen as a “lever” to be pulled, and that means a test could be run by pulling that lever. Let's use the example of a brand selling wallets. They could be shot with a model pulling them out of a pocket, looking like a “real life” use of the product. One lever you could pull would be to shoot the wallets on plain white tabletops, with no model. Tests are run between the two images, and you get some actionable insight into what works better, which is to say: what converts to a sale.
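Concretely, "what converts better" comes down to a conversion rate per variant: orders divided by the sessions that were served each image. A small sketch of the wallet example, with entirely made-up numbers:

```python
# Conversion rate per image variant: orders / sessions served that variant.
# All figures are invented for illustration.

def conversion_rate(orders, sessions):
    """Fraction of sessions that resulted in a purchase."""
    return orders / sessions if sessions else 0.0

variants = {
    "on-model (wallet from pocket)": {"sessions": 4000, "orders": 152},
    "tabletop (plain white)":        {"sessions": 4100, "orders": 123},
}

for name, v in variants.items():
    rate = conversion_rate(v["orders"], v["sessions"])
    print(f"{name}: {rate:.1%}")  # 3.8% on-model, 3.0% tabletop
```

In practice the site or analytics team computes this for you; the studio's job is knowing which lever produced which variant.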

In addition to the properties an asset has, there are also properties that come from the user journey on the website itself. For example, how did a user arrive on the site? If users arriving on the PDP via Instagram links convert at a massive rate, while users arriving from browser searches do not, your studio may be shooting a lot more social content for IG soon! These site attributes and asset attributes intersect to create the testing ground for your images.

What About KPIs?

Because asset performance testing is often tied to experimentation with things that are not necessarily in full production, it makes for a tricky subject for KPIs. It would be hard for a studio to gauge a team's performance based on video performance on the website when, for example, video is not being used site-wide. That said, a truly ambitious studio team could set a KPI for “image conversion to sales” rate, and aim never to go below it. For example, a goal could be set to “convert at 5% or higher on all 2022 imagery produced in Q4.” The conversion rate would then be a KPI the studio adopted. I can't say I've seen many studios tie KPIs to asset performance, but I absolutely think this will become more common as studios' tech stacks improve and they are rolled into the mainline business more cohesively.

In summary, there are many, many things to test for when discussing asset performance, especially when taking different industries into account. For everyone, though, the first step is having the ability to test, the partners to test with, and the drive to improve based on what you find.


Do

  • Break down organizational silos
  • Advocate for a modern tech stack that supports testing
  • Seek out opportunities to perform testing with studio assets

Don't

  • Make purely subjective creative decisions
  • Forget about the UX impacts on online conversions
  • Test just to test: have specific goals or questions in mind

Revenue Per Site Visitor Report

Leadership Goal/s

Increase revenue

Studio Goal/s

Increase average revenue per visitor by $10 in Q3 and Q4

KPI

% of baseline average revenue per visit in Q3 and Q4 of the previous year

Data Sources

Studio Management Platform, Cross-functional web team reporting

Dimensions

This will likely be provided by an external team

Metrics

Average revenue per visitor in Q3/Q4

Report Type

Numeric (percentage)

More Conversion Per Visit - A Big Win

The revenue per visitor report measures just what it sounds like: the amount each visitor spends per visit to your company’s site. This is a somewhat muddy KPI for a photo studio, because so much is out of your control: are prices rising or falling? Is the new site redesign a confusing labyrinth? Is the latest batch of products flying off shelves, no matter how good or bad the photography is? These are all valid questions, and are all likely beyond a studio's control. However, this doesn't mean we can't set some goals, make KPIs to support those goals, and try to dive into the world of asset performance and conversion KPIs.

One of the best ways to tilt the scales on this KPI, and get your customer to spend a higher amount, is through cross merchandising and “shop the look” initiatives. Realistically, the metric of average revenue per visitor will likely come from a non-studio team. But, through partnership, you can still build a great business case about this - showing the studio's impact on revenue through creative (and getting other teams excited about how partnering with the studio can help their own KPIs!).

The simplest, most straightforward version of this project would involve identifying a baseline, implementing a new (or improved) cross-merchandising or “shop the look” initiative, and comparing the results against that baseline. Super simple. If you want to drill down a little more specifically, you could introduce A/B tests, or even cross-reference the data against your studio management platform to see which products were included in this project. For example, let's say your team introduced new styling that included “shop the look” functionality on 300 out of 500 products you shot in June. When reviewing performance in June, you could compare the increase in revenue per visitor where these products were involved versus where they were not.
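As a sketch of the arithmetic (all numbers invented for illustration), revenue per visitor is just total revenue divided by total visits, compared across the two product groups:

```python
# Revenue per visitor (RPV) = total revenue / total visits.
# Hypothetical June figures for products with and without "shop the look".

def revenue_per_visitor(revenue, visits):
    """Average dollars spent per visit."""
    return revenue / visits if visits else 0.0

with_stl    = {"revenue": 184_000, "visits": 4_600}  # 300 products with the feature
without_stl = {"revenue": 110_000, "visits": 3_700}  # 200 products without

rpv_with = revenue_per_visitor(**with_stl)
rpv_without = revenue_per_visitor(**without_stl)
print(f"RPV with: ${rpv_with:.2f}, without: ${rpv_without:.2f}, "
      f"lift: ${rpv_with - rpv_without:.2f}")
```

A comparison like this is the business case in miniature: if the lift holds across the baseline period, the studio's styling work has a dollar figure attached to it.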

In summary, this is less of a studio KPI and more of a shared KPI with other teams, but we've included it because it's a great opportunity to get off the studio “island” and work with other teams on a tangible KPI that lives right at the intersection of commerce and creativity.

A/B Reports on Creative Changes (example: adding PDP Video)

Leadership Goal/s

Increase revenue

Studio Goal/s

Increase conversion rates by 0.5% on the top 50 performing products over the next 60 days

KPI

% difference in conversion rates between products with/without video in the A/B test

Data Sources

Cross-functional web team reporting

Dimensions

This will likely be provided by an external team

Metrics

Conversion rate

Report Type

Numeric (percentage)

A/B Testing and Creative Changes

A/B reports and site testing KPIs are more of a shared goal with cross-functional teams, not solely a studio goal - and that's not a bad thing! As we've discussed at length, breaking down those barriers and partnering with other teams is a huge win for a studio. That said, A/B testing, conversion rate reporting, and all that juicy webstore data are often kept separate from the folks creating the content. Let's change that.

I'd recommend building a monthly (or more frequent) meeting with studio leadership and the teams responsible for testing on your brand's site. Find out what they're curious about testing, and set out to give them the tools to do it. This isn't just about being a strong partner; it's also about sharing in the “wins” made by the photos, videos and content created by your team. Often, these teams are left to their own devices, testing with whatever assets have already been produced. Why not be more intentional? Work with your partners to pick an area, create the content to support the test, and see where you land!

A common example of this is adding video to a PDP. This can be an investment, so obviously you want to test it out. Work with your site team to identify a good, varied group of products to test on. Often, these will be somewhat higher performing products, so that there will be enough data to work with. After all, if you only sell 20 shirts, it's pretty hard to draw conclusions from a dataset that tiny. In our example, let's say the team identifies the top 50 performing products on the site. The studio then pulls samples for these and produces PDP video for them. The site team runs A/B tests for the next 60 days, reviewing performance for customers who were served video versus those who were not. At the end of the test, you'll be provided conversion rates for both sets. Ideally, you'll see a significant increase from the customers who were served videos.
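If you're wondering what "significant" means in that last sentence, the standard tool is a two-proportion z-test. Running it is normally the site/analytics team's job, but a sketch (with made-up numbers) shows what the reported result is based on:

```python
# Did PDP video "significantly" lift conversion? A two-proportion z-test
# sketch on hypothetical 60-day A/B results. Your analytics team would
# normally run this; it just helps to know what the report means.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value_one_sided(z):
    """Probability of seeing a lift this large by chance alone."""
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))

# A = no video, B = video (invented numbers)
z = two_proportion_z(conv_a=400, n_a=10_000, conv_b=470, n_b=10_000)
print(f"z = {z:.2f}, p = {p_value_one_sided(z):.3f}")  # z ≈ 2.4, p < 0.05
```

A p-value below the team's chosen threshold (commonly 0.05) is what lets them say the video lift is real rather than noise - and, per the shirt example above, small products simply won't generate enough sessions to get there.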

It's important to have a seat at the table for these content testing discussions. After all, the studio is the expert on content creation. While an increase in conversion is great, that's just the beginning of the conversation, and the beginning of the work for the studio. Can your studio figure out how to roll those PDP videos into existing productions at a minimal cost? Or a flat cost using the talents of existing staff? Perhaps your studio creatives can even suggest certain types of content they could provide easily, or at low cost, to give the site teams ammo for testing. The possibilities are endless - but without leaning into the opportunity, your studio is left behind when it comes time to celebrate the victories.
