From 'Lie to Me' to Public Trust: What Neuroscience Taught Me About AI Adoption
- Maria Alice Maia

- Jun 16
- 3 min read

What do users say they want? And what does their subconscious actually crave?
Early in my career, I ran a project playfully named "Lie to Me." The premise was simple: what people say in focus groups is often a poor predictor of what they will actually do in the marketplace. To get to the ground truth, we had to go deeper. We used neuroscience tools: EEG to measure brain activity, eye-tracking to follow visual attention, and the Implicit Association Test (IAT) to surface unconscious biases. These let us decode consumers' unfiltered, implicit preferences. We learned to measure the gap between stated belief and subconscious behavior.
That experience taught me the most important lesson of my career, one that scales from selling a product to deploying a national AI strategy: if you only listen to what people say, you're missing the most important part of the story.
"Doing Data Wrong": The Brilliant AI Nobody Trusts
This brings us to one of the most common failure modes in technology today. A company invests millions in developing a technically brilliant AI tool. It’s powerful, accurate, and solves a real problem. They run surveys and focus groups, and the feedback is positive. Users say they "like it" and "find it useful."
The tool launches. And flops.
Engagement is low, adoption stalls, and the project is quietly shelved. What went wrong? The company listened to what users said, but they failed to measure what users felt. They missed the implicit friction: the subtle hesitation, the flicker of confusion in the interface, the gut-level feeling of distrust that prevented users from truly embracing the technology. They measured explicit opinion but ignored implicit rejection.
The Neuroscience of Trust
The "Lie to Me" project proved that you can measure these implicit drivers. By analyzing neurological responses, we could see what truly captured attention, what caused cognitive friction, and what sparked genuine emotional engagement—insights that were often invisible in traditional market research.
This same principle is the key to solving the great challenge of our time: building public trust in AI.
Public trust is not a rational calculation based on a system's technical specifications. It is a deeply emotional, implicit, and behavioral phenomenon. People don't adopt technology because they read a white paper on its accuracy; they adopt it because it feels safe, fair, and reliable. As my current research on the behavioral drivers of AI adoption explores, we must understand the psychology of trust to build systems that people will actually use and support.
A Playbook for Building Implicit Trust
How do we move from measuring subconscious preference for a product to building societal trust in AI? The tools may change, but the framework is the same.
Measure What Matters, Not Just What's Said. You don't need an EEG machine to do this. You need to obsess over user behavior. A/B test tiny changes in your interface to see what reduces hesitation. Use session recordings to identify where users get confused. Measure drop-off rates, not just survey scores. These are all proxies for the implicit friction that kills adoption.
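To make this concrete, here is a minimal sketch of that kind of behavioral comparison, assuming you have logged how many users completed a task under two interface variants. The function and the numbers are hypothetical; a two-proportion z-test is just one simple way to check whether a reduction in hesitation is a real effect or noise.

```python
import math

def two_proportion_z_test(completions_a, n_a, completions_b, n_b):
    """Compare task-completion rates of two interface variants.

    Returns (rate_a, rate_b, z_score, two-sided p-value).
    """
    p_a = completions_a / n_a
    p_b = completions_b / n_b
    # Pooled rate under the null hypothesis that both variants perform alike.
    p_pool = (completions_a + completions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: variant B adds a plain-language explanation
# of how the AI reached its suggestion.
rate_a, rate_b, z, p = two_proportion_z_test(412, 1000, 468, 1000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z={z:.2f}  p={p:.4f}")
```

The point is not the statistics; it is that completion behavior, not survey sentiment, is what you test against.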
Design for Emotional Safety. Trust is an emotion, and it is built through design. Is your interface clear and predictable? Does it give the user a sense of agency and control? Does it communicate uncertainty honestly? Every design choice is an opportunity to either build or erode that subconscious feeling of safety.
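To show what "communicate uncertainty honestly" can look like in practice, here is a minimal sketch that routes a model's output through confidence tiers instead of presenting every prediction with the same authority. The thresholds and wording are hypothetical placeholders that would need calibration against your own model and users.

```python
def present_prediction(label: str, confidence: float) -> str:
    """Turn a model's confidence score into honest, user-facing copy.

    Thresholds are illustrative, not calibrated values.
    """
    if confidence >= 0.90:
        # Confident enough to lead with the suggestion.
        return f"Suggested: {label} (high confidence)"
    if confidence >= 0.60:
        # Hedge, and keep the user in control of the final call.
        return f"Possibly: {label}. Please double-check before acting."
    # Below that, admitting uncertainty builds more trust than guessing.
    return "Not confident enough to make a suggestion. Review manually."

print(present_prediction("Approve claim", 0.72))
# -> Possibly: Approve claim. Please double-check before acting.
```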
Make Trust the Product. The biggest mistake is treating trust as a feature to be added later or a PR issue to be managed. Trust is the product. For any AI system, especially in high-stakes domains like healthcare or finance, the primary job is to earn the user's implicit confidence. The technical function is secondary.
The insights from neuroscience are clear: the human brain is a complex machine that makes decisions based on a mix of conscious reasoning and powerful, subconscious heuristics. To build technology that succeeds in the real world, we must learn to decode and respect both.
What users say and what they do are often different. To understand the subconscious drivers of technology adoption, join my newsletter. If you'd like to discuss your specific concerns, start by scheduling a 20-minute consultation call.


