Tech Pros: Don't Just Report Averages, Show the Dynamic Impact of Your Changes
- Maria Alice Maia

- Feb 10
- 3 min read
Updated: Jul 15
To every data scientist and product analyst: You launched a new feature. You ran a perfect A/B test. You found a 2% lift in engagement after one week, declared victory, and moved on.
But what happened in week 2? Or month 2?
This is a subtle but critical form of "Doing Data Wrong": measuring a single, static "average effect" for a product or feature whose impact is almost certainly dynamic. You're taking a snapshot of a moving train and calling it the whole journey.

Let's take a consumer packaged goods (CPG) company with a mobile app.
The "Average Effect" Illusion
The Scenario: A CPG company launches a new "recipe finder" feature in their app to boost engagement and, ultimately, drive sales of their ingredients. They run a clean A/B test.
The Flawed Analysis: After one week, they measure the average effect. It's a small, positive lift. The product manager, looking at this single number, might conclude the feature is a minor success but not worth further investment, or even kill it for not showing a huge immediate ROI.
This is a failure of analysis because it ignores the story of user behavior over time. The question isn't "What was the effect?" The real question is "What is the shape of the effect over time?"
The Right Way: Show the Dynamic Impact
Instead of one number, you should show the event study plot. This is a simple but powerful graph that plots the difference between your treatment and control groups for each day or week before and after the launch.
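Here's a minimal sketch of what that can look like, assuming a pandas DataFrame `df` with one row per user per day and columns `date`, `group` ("treatment" or "control"), and `engaged` (your engagement metric); `LAUNCH_DATE` is a hypothetical placeholder for your launch date:

```python
# Minimal event-study sketch: daily treatment-control gap, before and after launch.
import pandas as pd
import matplotlib.pyplot as plt

LAUNCH_DATE = pd.Timestamp("2025-02-10")  # hypothetical launch date

# Mean engagement per day for each experimental group
daily = df.groupby(["date", "group"])["engaged"].mean().unstack("group")

# Daily treatment-minus-control gap, re-indexed as days since launch.
# Pre-launch days (negative x) should hover around zero if randomization held.
lift = daily["treatment"] - daily["control"]
lift.index = (lift.index - LAUNCH_DATE).days

ax = lift.plot(marker="o")
ax.axvline(0, linestyle="--", color="grey")   # launch day
ax.axhline(0, color="black", linewidth=0.8)   # no-effect baseline
ax.set_xlabel("Days since launch")
ax.set_ylabel("Treatment minus control engagement")
ax.set_title("Event study: effect of the recipe finder over time")
plt.show()
```

The pre-launch points double as a sanity check: if treatment and control were already diverging before the launch, your randomization or logging deserves a second look before you trust anything to the right of the dashed line.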
This visual narrative can reveal completely different stories, each with different strategic implications:
The Novelty Effect: A huge spike in engagement in the first few days that quickly fades as the novelty wears off. Strategic lesson: The feature is a great acquisition tool, but not a long-term retention driver. You need to keep innovating.
The Learning Effect: A small or even zero initial effect that grows steadily over weeks as users discover the feature, learn how to use it, and integrate it into their habits. Strategic lesson: The feature is a potential long-term asset. You should invest in user onboarding and education to accelerate the value curve.
A single "average effect" hides all of this. It collapses the rich story of user adaptation into one, often misleading, number. An analysis that showed a small "average" lift might be hiding a powerful learning effect that is your next big growth driver.
As a leader who has launched and scaled digital products at Ambev, FALCONI, and my own startup NaHora.com, I never wanted a single number. I wanted the story. Is this a gimmick or a habit-former? The dynamic effect plot is what tells you.
The Playbook:
Tech & Product Pros: Your job isn't to report a single τ. It's to tell the story of the treatment effect over time. Make event study plots a non-negotiable part of your A/B test analysis. This is how you elevate your insights from tactical to strategic.
Managers: Stop asking "What was the average lift?" Start asking, "Can you show me the plot of the effect over time?" Demand the narrative, not just the number.
My mission is to help teams bridge this gap between simple reporting and strategic analysis. This knowledge isn't mine to keep.
If you’re ready to tell more powerful, dynamic stories with your data, join my movement. Subscribe to my email list for more no-nonsense, research-backed insights.
And if you’re trying to understand the true long-term impact of a recent launch, book a 20-minute, no-nonsense consultation with me. Let's find the story in your data.


