
Core Concept: Outcome Regression for Causal Inference

  • Writer: Maria Alice Maia
  • Nov 11, 2024
  • 2 min read

Your ad campaign "lifted" sales by 10%. Are you sure it was the campaign? Or was it just... November?


This is one of the most common "Kindergarten Data" mistakes I see in Marketing Departments: a simple before-and-after comparison. You look at average sales before the campaign, compare it to the average during the campaign, and give the campaign credit for the entire difference.


This is not just wrong; it's dangerously misleading. You're ignoring seasonality, competitor actions, holidays, supply chain issues—dozens of other factors that could be driving sales.


To get to the real impact, you have to statistically isolate the effect of your campaign from all the other noise. The most straightforward way to do this is with a foundational method from the causal inference toolkit: Outcome Regression.

Think of it like this:

Your sales (Y) are the final song mix. Your ad campaign (the treatment A) is the one instrument you want to hear clearly. All the other factors—seasonality, promotions, competitor moves (the confounders X)—are other instruments playing at the same time.


A simple before-after analysis is like listening to the whole noisy orchestra. Outcome Regression is the mixing board that lets you turn down the volume on everything else so you can hear the clean, isolated track of your ad campaign.


In practice, instead of just comparing averages, you build a model:

Sales = β * Ad_Campaign + α1 * Seasonality + α2 * Competitor_Discounting + ...


The number you care about is β (beta). That coefficient represents the average treatment effect—the true impact of your campaign, cleansed of the influence of the other factors you included.
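The mixing-board idea fits in a few lines of Python. This is a toy sketch on synthetic data, where every number (including a made-up true lift of 5 units) is chosen purely for illustration: the naive before/after-style comparison gets inflated by seasonality, while the regression coefficient on the campaign recovers something close to the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Confounder: a seasonality index (think "it's November")
seasonality = rng.normal(0.0, 1.0, n)

# The campaign is more likely to run in high-season periods (confounding)
campaign = (seasonality + rng.normal(0.0, 1.0, n) > 0).astype(float)

# Assumed data-generating process: the campaign truly lifts sales by 5 units
sales = 100.0 + 5.0 * campaign + 8.0 * seasonality + rng.normal(0.0, 2.0, n)

# Naive before/after-style comparison: raw difference in group means
naive = sales[campaign == 1].mean() - sales[campaign == 0].mean()

# Outcome regression: include the confounder as a covariate
X = np.column_stack([np.ones(n), campaign, seasonality])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
adjusted = coef[1]  # beta, the coefficient on the campaign

print(f"naive difference: {naive:.2f}")   # inflated by seasonality
print(f"regression beta:  {adjusted:.2f}")  # close to the true lift of 5
```

The confounding here is built in on purpose: the campaign tends to run exactly when seasonality is already pushing sales up, so the raw difference in means gives the campaign credit for November.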


But there's a critical catch, and it's a big one. As the research makes clear, this only works if the model is correctly specified. That means you must have identified and included the right confounders and chosen the right functional form for each relationship (e.g., linear). Model misspecification is a "significant problem": leave out a real confounder, or model it incorrectly, and β absorbs the bias.
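That catch can be made concrete with another toy sketch (all numbers are again made up): simply "adding covariates" is not enough. The model below controls for a competitor variable, but it omits the seasonality that actually drives both the campaign and sales, so the campaign coefficient stays badly biased.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

seasonality = rng.normal(0.0, 1.0, n)  # the confounder that matters
competitor = rng.normal(0.0, 1.0, n)   # affects sales but NOT the campaign
campaign = (seasonality + rng.normal(0.0, 1.0, n) > 0).astype(float)
sales = (100.0 + 5.0 * campaign + 8.0 * seasonality
         - 3.0 * competitor + rng.normal(0.0, 2.0, n))

def campaign_coef(*covariates):
    """OLS coefficient on the campaign, adjusting for the given covariates."""
    X = np.column_stack([np.ones(n), campaign, *covariates])
    return np.linalg.lstsq(X, sales, rcond=None)[0][1]

beta_wrong = campaign_coef(competitor)               # misspecified: real confounder omitted
beta_right = campaign_coef(competitor, seasonality)  # includes the real confounder
```

Both models "control for something", but only the second controls for the thing that confounds the treatment, which is exactly why the modeling has to start from causal assumptions rather than from whatever columns happen to be in the data.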


This is why just running a regression isn't enough. It must be paired with the rigorous, causal thinking we’ve discussed—using tools like DAGs to map out your assumptions before you build the model.


In every business I’ve led or advised, from consumer goods at Ambev to my own startup, the first step toward a real data-driven culture is moving beyond simple averages and embracing the discipline of regression. It's the baseline for professional analytics.


My mission is to demystify these foundational tools. Regression isn't just for data scientists; it's a way of thinking every leader needs to understand to spot flawed analysis and demand better. This knowledge isn't mine to keep.


If you’re ready to move beyond kindergarten comparisons and build a more robust understanding of your business, join my movement. Subscribe to my email list.


And if you’re trying to isolate the impact of a specific initiative right now, book a 20-minute, no-nonsense consultation with me. Let’s design a better model together.

