Over the past decade, companies have spent billions on data infrastructure. Petabyte-scale warehouses. Real-time pipelines. Machine learning (ML) platforms.
And yet, ask your operations lead why churn increased last week, and you'll likely get three conflicting dashboards. Ask finance to reconcile performance across attribution systems, and you'll hear, "It depends on who you ask."
In a world drowning in dashboards, one truth keeps surfacing: Data isn't the problem; product thinking is.
The quiet collapse of “data-as-a-service”
For years, data teams operated like internal consultancies: reactive, ticket-based, hero-driven. This "data-as-a-service" (DaaS) model was fine when data requests were small and stakes were low. But as companies became "data-driven," the model fractured under the weight of its own success.
Take Airbnb. Before the launch of its metrics platform, product, finance and ops teams pulled their own versions of metrics like:
- Nights booked
- Active users
- Available listings
Even simple KPIs varied by filters, sources and who was asking. In leadership reviews, different teams presented different numbers, leading to arguments over whose metric was "right" rather than what action to take.
These aren't technology failures. They're product failures.
The consequences
- Data mistrust: Analysts are second-guessed. Dashboards are abandoned.
- Human routers: Data scientists spend more time explaining discrepancies than producing insights.
- Redundant pipelines: Engineers rebuild similar datasets across teams.
- Decision drag: Leaders delay or ignore action due to inconsistent inputs.
Because data trust is a product problem, not a technical one
Most data leaders assume they have a data quality issue. But look closer, and you'll find a data trust issue:
- Your experimentation platform says a feature hurts retention, but product leaders don't believe it.
- Ops sees a dashboard that contradicts their lived experience.
- Two teams use the same metric name, but different logic.
The pipelines are running. The SQL is sound. But no one trusts the outputs.
This is a product failure, not an engineering one, because the systems weren't designed for usability, interpretability or decision-making.
Enter: The data product manager
A new role has emerged across top companies: the data product manager (DPM). Unlike generalist PMs, DPMs operate across brittle, invisible, cross-functional terrain. Their job isn't to ship dashboards. It's to make sure the right people have the right insight at the right time to make a decision.
But DPMs don't stop at piping data into dashboards or curating tables. The best ones go further: They ask, "Is this actually helping someone do their job better?" They define success not in terms of outputs, but outcomes. Not "Was this shipped?" but "Did this materially improve someone's workflow or decision quality?"
In practice, this means:
- Don't just define users; observe them. Ask how they think the product works. Sit beside them. Your job isn't to ship a dataset; it's to make your customer more effective. That means deeply understanding how the product fits into the real-world context of their work.
- Own canonical metrics and treat them like APIs: versioned, documented, governed. Make sure they're tied to consequential decisions like $10 million budget unlocks or go/no-go product launches (a minimal sketch of such a metric contract follows this list).
- Build internal interfaces, like feature stores and clean room APIs, not as infrastructure, but as real products with contracts, SLAs, users and feedback loops.
- Say no to projects that feel sophisticated but don't matter. A data pipeline that no team uses is technical debt, not progress.
- Design for durability. Many data products fail not from bad modeling, but from brittle systems: undocumented logic, flaky pipelines, shadow ownership. Build with the assumption that your future self, or your replacement, will thank you.
- Solve horizontally. Unlike domain-specific PMs, DPMs must constantly zoom out. One team's lifetime value (LTV) logic is another team's budget input. A seemingly minor metric update can have second-order consequences across marketing, finance and operations. Stewarding that complexity is the job.
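To make "treat metrics like APIs" concrete, here is a minimal sketch of what a versioned, documented metric contract could look like in code. The MetricDefinition class, its fields and the nights_booked example are hypothetical illustrations, not an actual Airbnb or Uber implementation; the point is that a metric's owner, logic, version and consumers are declared explicitly rather than living in someone's notebook.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Hypothetical sketch: a canonical metric treated like a versioned API contract.
@dataclass(frozen=True)
class MetricDefinition:
    name: str          # canonical metric name shared across teams
    version: str       # bumped on any change to logic or filters
    owner: str         # accountable team or DPM
    description: str   # plain-language definition for non-technical users
    sql: str           # single source-of-truth computation
    consumers: list[str] = field(default_factory=list)  # teams whose decisions depend on it

NIGHTS_BOOKED = MetricDefinition(
    name="nights_booked",
    version="2.1.0",
    owner="data-product@company.example",
    description="Confirmed nights across all listings, net of cancellations.",
    sql="SELECT COUNT(*) FROM bookings WHERE status = 'confirmed'",
    consumers=["finance", "ops", "product"],
)

# Downstream dashboards and experiments reference the contract rather than ad-hoc SQL,
# so a version bump is visible to every consumer instead of silently changing their numbers.
```

Whether the contract lives in a Python module, a YAML file or a metrics platform matters less than the design choice it encodes: one definition, one owner, an explicit version history and a known list of consumers to notify when the logic changes.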
At these companies, DPMs are quietly redefining how internal data systems are built, governed and adopted. They aren't there to clean data. They're there to make organizations believe in it again.
Why it took so long
For years, we mistook activity for progress. Data engineers built pipelines. Scientists built models. Analysts built dashboards. But no one asked: "Will this insight actually change a business decision?" Or worse: We asked, but no one owned the answer.
Because executive decisions are now data-mediated
In today's enterprise, nearly every major decision, from budget shifts to new launches to org restructures, passes through a data layer first. But these layers are often unowned:
- The metric version used last quarter has changed, but no one knows when or why.
- Experimentation logic differs across teams.
- Attribution models contradict each other, each with plausible logic.
DPMs don't own the decision; they own the interface that makes the decision legible.
DPMs ensure that metrics are interpretable, assumptions are clear and tools are aligned to real workflows. Without them, decision paralysis becomes the norm.
Why this role will accelerate in the AI era
AI won't replace DPMs. It will make them essential:
- 80% of AI project effort still goes to data readiness (Forrester).
- As large language models (LLMs) scale, the cost of garbage inputs compounds. AI doesn't fix bad data; it amplifies it.
- Regulatory pressure (the EU AI Act, the California Consumer Privacy Act) is pushing orgs to treat internal data systems with product rigor.
DPMs aren't traffic coordinators. They're the architects of trust, interpretability and responsible AI foundations.
So what now?
If you're a CPO, CTO or head of data, ask:
- Who owns the data systems that power our biggest decisions?
- Are our internal APIs and metrics versioned, discoverable and governed?
- Do we know which data products are adopted, and which are quietly undermining trust?
If you can't answer clearly, you don't need more dashboards.
You need a data product manager.
Seojoon Oh is a data product manager at Uber.