Measuring Video Performance: Four Common Pitfalls

Given the important role video now plays in marketing and internal communications, you’d think most organizations would have well-thought-out strategies for measuring video performance. Unfortunately, this isn’t always the case. In fact, I’m often surprised by how little consideration goes into both deployment strategy and video analytics. Here are the mistakes I see most often, along with how to remedy them:

Pitfall 1: Having No Strategy for Measuring Performance

This happens more often than most organizations would like to admit. A lot of thought and effort goes into a video’s messaging or business application. However, once the video is sent out into the world, little effort is made to monitor how it actually performs. Unfortunately, failing to test video performance is a big missed opportunity. Not only does the organization fail to learn about the efficacy of the messaging or the audience’s response to it, but it also struggles to determine whether the video was actually worth the investment.
 
Fortunately, as long as you know what you want to achieve with a video, building a simple performance measurement strategy is not hard to do. From gathering data on the hosting platform (or content management system) and conducting audience testing to monitoring desired outcomes (sales, behavior change, etc.), there are quick and affordable ways to at least get a glimpse of how a video is ultimately performing. And though it’s not always straightforward (calculating the value of things such as social media likes or an improved employee retention rate can be challenging), monitoring video performance is the best way to prove to stakeholders that your video was worth the investment. A rough idea of performance is always better than none at all!
 

Pitfall 2: Aiming at the Wrong Target

I was recently on a call where a client cheerfully related to their CEO how many thousand total plays a video had already generated. They then raved about the high play rate and low audience drop-off. And these are all good things in themselves. However, the problem was that none of these metrics had much to do with the ultimate goal of the video, which was designed to shift public opinion about a controversial piece of legislation. In short, they simply pulled the most easily available metrics instead of creating performance indicators that better showed how the video was measuring up to its specific goals.
 
While easy-to-find ‘legacy’ metrics like views and play rate can be highly insightful, organizations need to define their own metrics and key performance indicators. For example, if a video is designed to drive traffic to a landing page, you need to find a way to measure just that. It doesn’t really matter whether everyone watched the whole video if they aren’t taking the desired action.
 
Fortunately, creating your own metrics doesn’t have to be hard. It starts with identifying the specific outcomes you want to see. Working backwards, you then consider what performance indicators and measurement mechanisms will demonstrate how successful you are in achieving these outcomes. For example, working closely with a large employer on a set of videos to supplement an onboarding program, we identified several targets we wanted to hit, such as driving up readership of the employee handbook, reducing travel policy infractions, and increasing 2-year employee retention rates. From here, we developed specific metrics and testing strategies for each goal, including surveys and even a short online quiz. As a result of the targeted metrics and testing insights, we were able to adjust the video messaging and deployment strategy until we eventually reached our goals. Moreover, because we calculated a monetary value for each outcome, we were able to demonstrate a significant return on investment to leadership.
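To make the return-on-investment piece concrete, here is a minimal sketch of the back-of-the-envelope arithmetic described above. All of the outcome names and dollar figures are hypothetical stand-ins for the values your own measurement strategy would produce:

```python
# Minimal ROI sketch. All figures below are hypothetical,
# for illustration only -- substitute your own measured outcomes.

# Estimated monetary value assigned to each measured outcome
outcome_values = {
    "handbook_readership_lift": 4_000,   # e.g. fewer HR support tickets
    "travel_policy_savings": 12_500,     # reduced policy infractions
    "retention_improvement": 30_000,     # avoided replacement-hiring costs
}

video_cost = 18_000  # hypothetical production + deployment spend

total_value = sum(outcome_values.values())
roi = (total_value - video_cost) / video_cost

print(f"Total value: ${total_value:,}")
print(f"ROI: {roi:.0%}")
```

The hard part, of course, is not the arithmetic but defending the dollar value you assign to each outcome; that is exactly why the targeted metrics and testing described above matter.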
 

Pitfall 3: Focusing on a Single Metric

Eventually, most organizations do land on a favorite metric – whether that be total plays, conversion rate, or the number of 20-year-olds sharing a video on TikTok. And this isn’t bad in itself. However, focusing on a single metric can blind you to other important information that could help the video or campaign perform even better.
 
Put simply, video campaigns are much more likely to succeed when you use a mix of metrics to get different views on how the content is performing. Specifically, we recommend a mix of both leading and lagging indicators. Leading indicators look forward, hinting at how you might be progressing against your goals. For example, a low video click-through rate is a leading indicator that conversion rates or information retention might eventually fall short of your goals. Likewise, a high engagement rate could be a leading indicator that conversions or product sales will be higher than expected. Either way, leading indicators allow you to shift strategies, tweak video messaging, or prepare alternative solutions before it is too late. Conversely, lagging indicators measure how well the video actually did against your goals. In other words, they typically look back on how things turned out. Total impressions, quarterly sales, and total employee turnover are common examples of lagging indicators. In short, both leading and lagging indicators are important to having a wider view of video performance.
 

Pitfall 4: Relying on Anecdotal Evidence

Over the years, I can’t count how many times I’ve asked clients how their videos were performing, only to hear something anecdotal like, “Our CEO just loved it,” or “The several customers I shared it with all thought the messaging was spot-on.” Though it’s always good to hear performance anecdotes, they should never substitute for a measurement strategy. Like small sample sizes and regional or cultural biases, anecdotal evidence can lead us astray and keep our videos and campaigns from reaching their full potential. You need to go deeper.
 
Within the limits of your resources, we recommend collecting as much data and audience research as possible. Likewise, try to balance qualitative audience research (e.g., focus groups) with data or testing that is more quantitative. Thanks to A/B testing services like Biteable and Maze, this has never been easier.
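Dedicated services will run the statistics for you, but the core comparison behind an A/B test is simple enough to sketch. The example below uses a standard two-proportion z-test to check whether two video variants’ conversion rates differ significantly; the variant labels and sample numbers are hypothetical:

```python
import math

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """Z-statistic comparing the conversion rates of two variants."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: variant B's call to action converts better
z = two_proportion_z(conversions_a=120, n_a=2400,
                     conversions_b=165, n_b=2500)

# |z| > 1.96 corresponds to significance at the 5% level (two-tailed)
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

The point is not the formula itself but the discipline it enforces: you compare variants against a measurable outcome rather than a gut feeling, which is precisely the antidote to anecdotal evidence.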
 
 

Want to discuss your video performance testing or deployment strategy? Or build a video together?

Reach out: seth@splainers.com

Ready to bring your story to life?