
Could You Increase Your Ad Revenue by Doing More Creative Testing? Find Out With Regression Analysis



I’ve been running digital ad campaigns for over 10 years now and have managed ad budgets for companies spending more than $5M/month and others spending as little as $5k/month. What do businesses at both ends of that spectrum have in common? A constant need to produce and A/B test new creative. Unfortunately, businesses of both sizes can sometimes lack the motivation to set aside time and budget for developing and testing ad creative.


If you’ve been running ads and keep seeing similar results, you can use the technique below to estimate how much revenue you may be missing out on by not investing in new creative regularly. After all, what’s more motivating than realizing you may be leaving money on the table?



*Above is a scatter chart with a trendline to visually demonstrate the relationship between the real-world variables I’ve used in this article.


Before we begin, you’ll need a few things:


  • A Meta (Facebook/Instagram) Ads account with consistent monthly ad spend data to analyze.


  • Multiple versions of previous ad creative with differing results to analyze.


  • Clean, audience-segmented ad campaigns.


  • Intermediate-level experience with Microsoft Excel or Google Sheets.


  • Roughly 20-40 minutes of focused time.



To do this, we’re going to use regression analysis. What is regression analysis? Here’s a quick refresher:


According to Wikipedia, regression analysis is


“a set of statistical processes for estimating the relationships between a dependent variable (often called the 'outcome' or 'response' variable, or a 'label' in machine learning parlance) and one or more independent variables (often called 'predictors', 'covariates', 'explanatory variables' or 'features').”
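
In plain terms, the flavour of regression we’ll use here is simple linear regression: it fits a straight line, y = m·x + b, through your data points, where y is the metric you want to predict, x is the metric you think drives it, m is the slope and b is the intercept. The spreadsheet formulas later in this article do the line-fitting for us; all we have to do is read off the outputs.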


For us digital marketers, you can think of it like this:


We usually review ad creative performance with the same metrics: thumb-stop rate, click-through rate, average cost per click, impressions until conversion, conversion rate, add-to-cart rate, and so on, depending on your objectives. Well, what if you could determine which of those metrics statistically helps predict another?


This can help you decide which metrics to focus on and, in monetary terms, how much effort is worth putting into developing new ad creative. In this example, the two variables we’re going to use are unique outbound click-through rate (CTR) and cost per unique outbound click (CPC).


*Note: the data set below is based on the last 30 days of performance from an ad account that my agency manages. All of the data was exported at the ad-name level from Meta Ads Manager, from one campaign that tested 9 different creatives (which explains the 9 rows of data) over the 30-day period. This particular campaign is designed to drive top-of-funnel (acquisition) purchases on a Shopify Plus store, and therefore excludes all website purchasers, remarketing users and social media engagers. Total ad spend for the 30 days in this campaign was slightly over $25k USD. Because the audience, objective and time frame were consistent while enough creative types were tested, we were able to see a strong correlation in the data for this example.


Now that we have our ad data in columns A and B, we can use the LINEST formula to calculate our outputs. In cell D2, I’ve written:


=LINEST(A2:A10,B2:B10,TRUE,TRUE)
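
A quick note on reading the output: in both Excel and Google Sheets, LINEST with the last argument set to TRUE returns a 5-row array of regression statistics. For a single x-variable, the first row holds the slope and intercept, and the coefficient of determination (R²) sits in the third row, first column. If you’d rather pull R² straight into a single cell, you can wrap LINEST in INDEX (using the same ranges as above):


=INDEX(LINEST(A2:A10,B2:B10,TRUE,TRUE),3,1)


With one x-variable, you’ll get the same R² regardless of which of the two columns you treat as the y-range, so don’t worry if your columns are laid out the other way around.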



How to interpret and use this data?


Mainly, we want to look at the coefficient of determination (R-squared), which is a number between 0 and 1 that measures how well a statistical model predicts an outcome. Here are some guidelines to help you interpret your ranges:


  • 0-0.10 indicates very weak to no correlation; the model explains little to none of the variation


  • 0.10-0.70 indicates weak to medium correlation


  • 0.70-1 indicates that there is a strong correlation between the dependent and independent variables


  • A value of 1 indicates that all of the variation in the dependent variable is explained by the independent variable


With our coefficient of determination at 0.75, roughly 75% of the variation in the cost per unique outbound click can be explained by the unique outbound click-through rate, which strongly supports the expectation that as CTR increases, CPC decreases. Meta typically charges by impression, so this shouldn’t be a huge surprise if you’re running enough ad spend.


Now, here’s the interesting part. For future creative iterations, if we’d like to set a benchmark for CTR, say increasing our average CTR from 1.03% to 1.20%, we can use the FORECAST formula to estimate the CPC we could expect to pay by typing:


=FORECAST(0.012,B2:B10,A2:A10)
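
For context, FORECAST takes three arguments: the new x-value you want a prediction for (our target CTR, entered as the decimal 0.012), the range of known y-values (the metric being predicted, CPC in our case), and the range of known x-values (the metric we’re benchmarking, CTR). Under the hood it simply plugs the new x into the fitted line, y = m·x + b, and returns the predicted y, i.e. the CPC we could expect at a 1.20% CTR, assuming the historical relationship holds. If you want a sanity check built by hand from the slope and intercept, the formula below is equivalent (note that the y-range and x-range are ordered to match the FORECAST arguments):


=INDEX(LINEST(B2:B10,A2:A10),1,1)*0.012+INDEX(LINEST(B2:B10,A2:A10),1,2)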



Using the forecasted CPC above, we can project how many incremental unique outbound clicks could be generated at the same ad spend simply by designing new, and hopefully improved, creative that lifts our average outbound click-through rate. In this example, the average conversion rate for this top-of-funnel purchase campaign is 2.94%, so the store could expect 39 additional orders each month. Assuming an average order value of $80 for round, anonymized numbers, that works out to roughly $3,120 in incremental revenue each month.
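
If you want to reproduce that projection in your own sheet, the chain of arithmetic looks like the sketch below. The cell references are hypothetical (G1 for monthly ad spend, G2 for your current average CPC, G3 for the forecasted CPC from the FORECAST formula, G4 for conversion rate and G5 for average order value); plug in your own account’s figures.


  • Incremental unique outbound clicks at the same spend: =G1/G3-G1/G2

  • Incremental orders: =(G1/G3-G1/G2)*G4

  • Incremental monthly revenue: =(G1/G3-G1/G2)*G4*G5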


Now, if you’re able to learn from your best-performing creative and have a graphic designer or video editor make a few iterations to test for less than that figure (impactful design changes can often be made for as little as $300-$500), you can enjoy increasing both your top and bottom line.


I hope you found this helpful and that it inspires you to continually test and improve your ad creative. I’ll be releasing future articles on how to identify what to test in your ad creative and which tests tend to drive the most impact, so stay tuned for those.

