The Myth of the Data-Driven Organization
Too many marketing orgs brag about being "data-driven" when they should focus on being "insights-driven," where applied human judgment informed by data drives the culture.
"Great data, but what did you learn from it? What are you going to do differently tomorrow? Next month? Next campaign? How will this shape our behaviors?"
This is my default reaction to seeing yet another marketing dashboard, and thanks to the ease of creating them through new tools and certainly AI, we all see a lot of increasingly professional and visually appealing dashboards filled with impressive data points, trend/sparklines, deltas-vs-target, etc. But data without insight attached is just a pile of numbers that gives the illusion of knowledge.
This may boil down to a question of semantics, but the words we use send important cultural signals to our teams.
"Data-Driven" is Not "Insights-Driven"
Every marketing organization I've worked in or consulted with describes itself as "data-driven." It's simply table stakes language in job postings, investor decks, board presentations, and team culture documents. I know I certainly have used the term in my own "how I lead" presentations over the years. And to be fair, most teams are doing something with data. They have dashboards, they run attribution models, they A/B test, they track conversion rates and pipeline velocity and customer acquisition costs with genuine rigor with a commitment to accountability for marketing return-on-investment (ROI).
But here's a reality that I'm confident many marketers, at all levels, will identify with: in many marketing organizations, the data doesn't actually drive the decisions. The decisions get made the way they've always been made, through a combination of executive instinct, organizational politics, legacy strategy or commitments, and whoever the loudest voice in the room might be. Only then does the data get applied after the fact, to justify decisions or at least provide some level of false air cover.
"If you torture the data long enough, it will confess."
- Ronald H. Coase
If you've been in the profession long enough you will recognize the pattern. The weekly dashboard review where everyone simply reports out the numbers, good and bad. The beautiful dashboard visuals that everyone has bookmarked (or that sit on a monitor in the team office space) but no one really pays much attention to. The attribution model that the analytics team knows is flawed but nobody really challenges because it tells a convenient story about which channels "deserve" budget. The A/B test that gets designed to confirm a decision an influential stakeholder already made in their head. The quarterly business review (or Board meeting) where the slides are packed with carefully stage-managed metrics but the actual conversation that matters happens in a hallway fifteen minutes later.
This is the theater of data-driven decision-making: the data exists, the collection infrastructure is humming along, the reports get generated, the fancy new AI tool spins up every manifestation of a dashboard the ELT could dream of. But the actual decisions are still being made on gut, politics, or simply inertia, just dressed up in a data veneer that makes everyone feel more rigorous than they actually are.
I'm not saying this to be cynical. The people involved are usually well-intentioned, smart marketers. They genuinely believe they're being data-driven. But there's a meaningful gap between collecting/displaying data and intentionally tapping data to inform how you think, and it's as much cultural as it is related to tools, talent, and infrastructure.
Why the Gap Between Data and Insights Persists
The typical consulting answer here is "you need to build a data culture." Which is certainly a valid and reasonable statement, and a great way to rack up consulting hours to be sure, but also vague to the point of uselessness. The actual reasons this gap persists are part human nature, part team culture, and part technology.
Human Nature: aka "the incentive problem." In most organizations, being wrong with data is more career-safe than being right on instinct. If you make a bold call based on market intuition and it doesn't work out, you own that failure personally. If you make a conservative call backed by a 40-slide analytics deck and it doesn't work out, well, the data was misleading, the market shifted, the model had limitations. Data provides organizational cover (much like offloading decision-risk to wildly expensive management consulting firms does).
Many marketers, and certainly many leaders, learn, consciously or not, to collect data defensively, as a survival tool, rather than use it decisively and proactively to improve the business. The result is an organization that by all outward appearances looks rigorous but is actually risk-averse, which is a very different thing.
"Friends and family I trust, everyone else bring data."
- Microsoft colleague from the early days of my career
Technology: aka "the more martech isn't always a good thing problem." Modern martech, and AI in particular, has made it trivially easy to generate more data - and more dashboards - than any human or team can meaningfully process. The reality is that the bottleneck in marketing hasn't been access to data, at least not in recent years. The bottleneck is the human judgment to know which data matters and which is noise. More data often makes this harder, not easier, because it creates the illusion of comprehensiveness. When you have a dashboard with 47 metrics, you can always find one that supports whatever story you want to tell. Or, alternatively, a dashboard with 47 metrics simply becomes a shiny brag piece that gets widely ignored because it takes too much cognitive load to derive meaning from it all.
To be fair, as much as AI is part of the problem, it can also be part of the solution: Taking a pile of metrics and using AI to filter, sort, and surface meaningful insights is a very common marketing AI workload. But it takes leadership and discipline to avoid the temptation to add more data churn to the pile in the hopes AI will sort it out down the line.
Culture: aka "the data over judgment problem." Sometimes the culture creates, or at least reinforces, the gap, and while this can certainly come from marketing leadership, often it's symptomatic of a wider company culture issue: one where the scorecard rules, and overrules, all. Data becomes a proxy for performance, a replacement for judgment, and a disincentive for innovation, risk-taking, and creativity. This is a complex topic, so I will save a deeper exploration of it for a separate post.
What Being "Insight-Driven" Looks Like
I've started using different language with my teams over the years: we're not "data-driven" but "insight-driven." It's a subtle shift in wording but a meaningful shift in mindset and team culture. Data-driven implies we go where the data tells us. Insight-driven means the marketer makes the decision using their own judgment and experience, with data as one critical input alongside market signals, competitive intelligence, team capabilities, creative inspiration, and the accrued judgment that comes from years of experience.
I tell every team I lead that I'm not so much concerned about what the data point says, rather what we are learning from it. What insights we are gleaning, and how we are combining that with everything else to adjust our strategy, tactics, spend, and behaviors.
What does this look like in practice for a marketing organization?
It looks like marketers at all levels who distinguish between "measurable" and "important" and understand that these are not the same thing. Some of the most important things in marketing (brand perception, competitive positioning, trust, creative quality) are genuinely difficult to measure with any kind of precision or meaning. The fact that something is hard to quantify doesn't make it less real or less important to the business. Conversely, the fact that you can measure something to six decimal places doesn't make it significant, meaningful, or useful for generating real insight.
It looks like leaders who know how to sift through the mass of data to elevate focus on the data points that matter for the team and the wider organization. And those same leaders knowing both how to translate those data points into insights and how to communicate the meaning and value of those insights to the CEO, Board and their peers in the C-suite.
It looks like campaign and project retrospectives (AARs, post-mortems, etc.) that deeply examine the intent and quality of decisions, not just outcomes. A good decision can produce a bad outcome, and a bad decision can produce a good outcome (aka luck). If you only focus on outcomes - the number on the scorecard - you devalue or even disincentivize creative judgment and intentional risk-taking. The most sophisticated marketing organizations I've seen evaluate the quality of the thinking (the rigor, the intent) that went into a decision separately from whether it worked out.
And it looks like leaders and marketers who are comfortable saying "the data is inconclusive or incomplete, and here's what I think we should do based on our collective judgment," operating in a culture that supports and celebrates taking action even in the face of imperfect data.
The AI Twist
I mentioned AI earlier, and how it can both complicate and simplify this challenge at the same time. AI is about to make the "data-driven" approach even more alluring and even more dangerous. AI-powered analytics tools can now build dashboards, generate analyses, surface patterns, and present recommendations with an apparent confidence that makes their results look and feel deeply compelling. An AI-generated recommendation can arrive fully formed, backed by aggregated or synthesized data, written with an authoritative voice, and yet be completely wrong about what your team should actually do.
The fact it only takes a few quick prompts to generate these recommendations, or to spin up new dashboards and analytics, simply compounds the risk here.
This connects to something I've written about before: the skills that matter most in an AI-augmented world are the deeply human ones. Judgment, experience, insight, creativity, empathy, pattern recognition when the data is ambiguous. These human skills are needed more than ever in a business world where the problem has shifted from decision-making paralysis due to insufficient data to outsourcing decisions to overly-confident AI outputs that might be horribly inaccurate.
In an AI world, being "insight-driven," with human judgment playing a major role in generating and applying those insights to the work, is more important than ever.
The TL;DR
"Data-driven" has become table stakes in most discussions around org culture, marketing team charters, and CMO job descriptions. It's used as shorthand for being fluent in analytics and incorporating data into our decision-making and accountability processes. Through this lens, "data-driven" is a good thing.
But I'd argue "insights-driven" is a better way to express this same intent: it reinforces the critical role of human insight and judgment in a world increasingly awash in data, reports, and dashboards. Just as critically, it shifts the focus from "what happened" to "why it happened, what did we learn, and how will it change our behavior," which is ultimately the value of all that data in the first place.