“Data-driven” has become shorthand for smart, modern, rigorous. The opposite of gut decisions. The antidote to opinion-based leadership.
And yet, we’ve seen data-driven companies make terrible decisions. Not because they ignored the data, but because they followed it—right off a cliff.
The Data Worship Problem
Data doesn’t interpret itself. Every metric is a simplification of reality, capturing some aspects while ignoring others. Every analysis involves choices about what to measure, how to segment, which time period to examine.
These choices embed assumptions. And when the assumptions are wrong, the data leads you astray: confidently, convincingly wrong.
Common Ways Data Misleads
Measuring the measurable. We optimize what we can measure, even when it’s not what matters most. Clicks are easier to count than brand perception. Conversion rate is simpler than customer satisfaction. But the easy metrics aren’t always the important ones.
Averaging away insight. Aggregate data hides segment-level differences that might change everything. Your overall conversion rate looks fine, but one customer segment is thriving while another is failing. The average masks both stories (see the sketch after this list).
Survivorship bias. You’re analyzing customers who converted, trying to understand what works. But you’re not seeing the customers who left, who might reveal where you’re actually failing.
Short-term over long-term. Most analytics measure immediate outcomes. But some tactics that boost short-term conversion damage long-term brand value or customer lifetime value. The data shows a win; reality shows a loss.
Correlation as causation. Two things moved together. But did one cause the other? Or did something else cause both? Or is it coincidence? Data shows correlation; judgment determines causation.
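To make "averaging away insight" concrete, here is a minimal sketch with hypothetical numbers (the segment names and figures are invented for illustration, not taken from any real dataset): the blended conversion rate looks steady while the two segments tell opposite stories.

```python
# Toy illustration of "averaging away insight".
# All numbers are hypothetical; the point is the arithmetic, not the data.

segments = {
    # segment: (visitors, conversions)
    "new visitors":        (8_000, 160),  # 2.0% and falling
    "returning customers": (2_000, 140),  # 7.0% and thriving
}

total_visitors = sum(v for v, _ in segments.values())
total_conversions = sum(c for _, c in segments.values())

print(f"Overall conversion rate: {total_conversions / total_visitors:.1%}")
for name, (visitors, conversions) in segments.items():
    print(f"  {name}: {conversions / visitors:.1%}")

# The blended rate is 3.0%, which may look identical to last quarter,
# even though the two segments are moving in opposite directions.
```

If the mix between segments shifts over time, the blended number can even move in the opposite direction from every individual segment, which is the classic Simpson's paradox.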
The Interpretation Layer
Between data and decision is interpretation. This is where human judgment—the thing data was supposed to replace—remains essential.
What does this metric actually measure? What doesn’t it capture? What assumptions are embedded in how it’s calculated? What alternative explanations exist for what we’re seeing?
These questions require thinking, not just analysis. They require understanding the business, the customers, the context—things that don’t fit in a dashboard.
When to Trust Your Gut
“Data-driven” doesn’t mean “data-only.” There are times when judgment should override what the numbers suggest:
When the data is thin. Small sample sizes, short time periods, limited segments: these produce unreliable conclusions dressed in false precision (a quick illustration follows these points).
When the stakes are high. Strategic decisions about brand positioning or customer experience shouldn’t be made solely on A/B test results.
When something feels wrong. If the data says one thing but experienced people in your organization feel uneasy, that unease is information too. Don’t dismiss it just because it’s not quantified.
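To put a rough number on what "thin" means, here is a small sketch, assuming a plain normal-approximation interval and made-up sample sizes: the same observed rate carries very different uncertainty at 50 visitors than at 50,000.

```python
import math

# Rough illustration of false precision from thin data.
# Same observed conversion rate, very different sample sizes; numbers are made up.

def approx_interval(conversions: int, visitors: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% interval for a conversion rate (normal approximation, for illustration)."""
    p = conversions / visitors
    half_width = z * math.sqrt(p * (1 - p) / visitors)
    return max(0.0, p - half_width), min(1.0, p + half_width)

for conversions, visitors in [(3, 50), (300, 5_000), (3_000, 50_000)]:
    low, high = approx_interval(conversions, visitors)
    print(f"{conversions}/{visitors}: 6.0% observed, plausibly {low:.1%} to {high:.1%}")
```

At 50 visitors, a 6.0% observed rate is consistent with anything from roughly zero to the low teens; quoting it to one decimal place is exactly the false precision thin data invites.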
A Healthier Relationship
The goal isn’t to abandon data. It’s to use it appropriately—as one input among several, not as the final arbiter of truth.
Combine quantitative data with qualitative research. Talk to customers, not just about customers. Understand the story behind the numbers.
Be explicit about assumptions and limitations. Every analysis should come with caveats, not just conclusions.
Stay curious about what the data might be missing. The most important dynamics might be the ones your metrics don’t capture.
The Humble Approach
The best data practitioners we know share a quality: humility. They know what data can and can’t tell them. They hold conclusions loosely. They seek disconfirming evidence.
This isn’t weakness—it’s wisdom. Data is a tool, and like any tool, it works best in skilled hands that understand its limitations.