
Key Points
- Chris McDivitt, Managing Director at Accenture, challenged the traditional practice of cleansing data before adopting AI. In a conversation with CIO News, he instead advocated tackling complex, cross-functional AI projects from the start.
- He suggested using AI to build the foundational data layer, adopting a data mesh and knowledge graph to better contextualize data, and delivering a 16-week MVP that solves a real business problem and secures funding for the larger transformation.
- The ultimate goal is to create a human-AI learning loop to enhance decision-making and adapt to increasing disruptions, according to McDivitt.
The conventional wisdom for AI adoption is to start small with isolated proofs of concept. But a bolder strategy is gaining traction, one that argues for tackling complex, cross-functional challenges from the start. The urgency is now palpable: in a matter of weeks, the primary question from executives has shifted from "What is it?" to "How do I do it?"
Going big from the start drives faster buy-in, shows a clearer return on investment, and builds the foundational knowledge layer for future orchestration. But the approach hinges on a counterintuitive idea: messy data is not the blocker most think it is.
We spoke with Chris McDivitt, a Managing Director at Accenture and the Global Lead for its Autonomous Supply Chain Capability. Drawing on more than 30 years of experience transforming complex operations at giants like UPS and Capgemini, McDivitt offers a strategy that challenges decades of IT convention. He believes this go-big vision is the only way for large organizations to escape pilot purgatory and unlock the true potential of AI.
The perfect data myth: McDivitt’s first move is to challenge the belief that a massive, multi-year data cleansing program is a prerequisite for AI. The fear of imperfect data, he said, is a painful leftover from the ERP era and a mindset holding businesses back. "Don't think of this as a massive, three-year data program. It's about future-proofing the enterprise for an inevitable cascade of technologies, from physical AI, then humanoids, and then quantum."
For CIOs worried about another massive data project, McDivitt offered a counterintuitive solution: "The AI itself can help construct the foundation it runs on. You can use agents as builders of your data products. They can do validation steps, and they can work on metadata."
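As a rough illustration of that agent-as-builder idea, the sketch below uses a plain Python function as a deterministic stand-in for a validation agent that checks a data product's records and attaches quality metadata. The DataProduct structure, field names, and scoring are invented for illustration, not Accenture's implementation; a real builder agent would likely be LLM-driven rather than rule-based.
```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataProduct:
    name: str
    owner: str                 # the business domain that owns the content
    records: list[dict]
    metadata: dict = field(default_factory=dict)

def validation_agent(product: DataProduct, required: set[str]) -> DataProduct:
    """A builder agent's validation step (hypothetical): check records for
    completeness and attach quality metadata to the product."""
    incomplete = [
        r for r in product.records
        if any(r.get(f) in (None, "") for f in required)
    ]
    product.metadata.update({
        "validated_at": datetime.now(timezone.utc).isoformat(),
        "record_count": len(product.records),
        "incomplete_records": len(incomplete),
        "completeness": 1 - len(incomplete) / max(len(product.records), 1),
    })
    return product

orders = DataProduct(
    name="open_orders",
    owner="supply_chain_planning",
    records=[
        {"order_id": "SO-1001", "sku": "A-17", "qty": 40},
        {"order_id": "SO-1002", "sku": None, "qty": 12},  # imperfect on purpose
    ],
)
print(validation_agent(orders, {"order_id", "sku", "qty"}).metadata)
```
The point of the pattern is that imperfect data does not block the program: the agent measures and records the imperfection as metadata, and the product ships with its quality score attached.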
A mesh architecture: The approach centers on building a modern data layer through a data mesh, an architectural shift that allows organizations to productize their data. The mesh architecture shatters the long-dominant data lake model, whose limitations McDivitt explained with a simple analogy. "A data lake is like a pantry full of ingredients where you don't understand the relationships. A data mesh allows you to actually create a recipe," he said.
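To make the pantry-versus-recipe analogy concrete, here is a minimal sketch of what "productizing" data might look like: a named output built from declared inputs that encodes the relationship between them, rather than raw tables sitting in a lake. The product names and join logic are hypothetical.
```python
# A data lake as a "pantry": raw ingredients with no declared relationships.
pantry = {
    "orders":    [{"order_id": "SO-1001", "sku": "A-17", "qty": 40}],
    "inventory": [{"sku": "A-17", "on_hand": 35}],
}

def fulfillment_risk(orders: list[dict], inventory: list[dict]) -> list[dict]:
    """A mesh-style 'recipe': a named, owned data product that declares its
    inputs and makes the orders-to-inventory relationship explicit."""
    on_hand = {row["sku"]: row["on_hand"] for row in inventory}
    return [
        {**o, "short_qty": max(o["qty"] - on_hand.get(o["sku"], 0), 0)}
        for o in orders
    ]

print(fulfillment_risk(pantry["orders"], pantry["inventory"]))
# -> [{'order_id': 'SO-1001', 'sku': 'A-17', 'qty': 40, 'short_qty': 5}]
```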
The knowledge graph: But the mesh is only part of the solution. The crucial next layer is a knowledge graph that automates the contextualization of data products. This is where the digital brain truly comes to life: it encodes an organization's invaluable tribal knowledge, the expertise of its best planners and operators. The move represents a fundamental pivot, not an add-on. As McDivitt put it, “You can't just take an agentic architecture and stitch it onto an S/4HANA ERP platform. It doesn't work like that.”
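At its simplest, a knowledge graph can be reduced to subject-relation-object triples that give agents context a raw table lacks. The toy example below encodes one invented planner heuristic alongside lineage links between the data products sketched earlier; all entities and relations are assumptions for illustration.
```python
# Toy knowledge graph: (subject, relation, object) triples linking data
# products and encoding one piece of "tribal knowledge" explicitly.
triples = [
    ("open_orders",      "produced_by",    "order_management"),
    ("carrier_capacity", "produced_by",    "logistics"),
    ("open_orders",      "constrained_by", "carrier_capacity"),
    # A planner heuristic that usually lives in someone's head:
    ("carrier_capacity", "degrades_when",  "port_congestion_alert"),
]

def context_for(entity: str) -> list[tuple[str, str, str]]:
    """Everything the graph knows about an entity, so an agent can reason
    over relationships instead of isolated rows."""
    return [t for t in triples if entity in (t[0], t[2])]

for subject, relation, obj in context_for("carrier_capacity"):
    print(f"{subject} --{relation}--> {obj}")
```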
The entire shift boils down to a simple mantra: bring AI to the data, not data to the AI. The new operating model, which he compared to Uber, splits responsibilities: the business owns the data's content (the roads and cars), while IT owns the platform and its rules (the ride-sharing app and its policies). But the grand vision also raises the inevitable executive question: How do we start without an open checkbook?
Two paths to avoid: McDivitt’s advice is to reject two common but flawed paths: the slow, bureaucratic multi-year business case and the trivial, dead-end proof of concept. He also contrasted his approach with failed legacy efforts like control towers, arguing they were incomplete solutions: "They gave you visibility to an issue, but then they fell short. They didn't tell you the impact, the root cause, or the next best action."
To avoid these mistakes, he advised leaders to adopt a mega use case: a complex, cross-functional project that builds an integrated, modern solution from the start. The ambitious scope is de-risked, he said, by packaging it into a tangible, time-bound project.
The 16-week MVP: The MVP solves a real business problem while building the first pillars of a modern data framework. McDivitt said, "We advise clients to take a mega process like plan-to-execute and deliver a 16- to 20-week MVP. Within that, you define an agentic workforce where an orchestrator agent drives an outcome like On-Time In-Full." The orchestrator, in turn, directs more specialized super agents and utility agents. Deliver a tangible win on a complex problem, then, as McDivitt put it, "go get the big transformation dollars for the full vision."
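The MVP's agent hierarchy might be wired together along these lines: an orchestrator sequences specialized super agents and a utility agent toward the On-Time In-Full (OTIF) outcome. The agent names, data fields, and routing below are hypothetical; real super agents would wrap planning systems or LLMs rather than one-line rules.
```python
from typing import Callable

def demand_super_agent(order: dict) -> dict:
    """Super agent (hypothetical): can the order be filled in full?"""
    order["in_full"] = order["available_qty"] >= order["qty"]
    return order

def logistics_super_agent(order: dict) -> dict:
    """Super agent (hypothetical): will the order arrive on time?"""
    order["on_time"] = order["transit_days"] <= order["days_to_due"]
    return order

def escalation_utility_agent(order: dict) -> dict:
    """Utility agent: flag OTIF exceptions for a human planner."""
    if not (order["in_full"] and order["on_time"]):
        order["next_best_action"] = "escalate_to_planner"
    return order

def orchestrator(order: dict, workforce: list[Callable[[dict], dict]]) -> dict:
    """Orchestrator agent: drives the OTIF outcome by sequencing the
    agentic workforce over each order."""
    for agent in workforce:
        order = agent(order)
    return order

result = orchestrator(
    {"order_id": "SO-1001", "qty": 40, "available_qty": 35,
     "transit_days": 4, "days_to_due": 6},
    [demand_super_agent, logistics_super_agent, escalation_utility_agent],
)
print(result)  # in_full is False, so the order is escalated
```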
The ultimate goal is to create a virtuous cycle between humans and AI, a learning loop that generates compounding value. He urged leaders to shift the value proposition away from simple efficiency metrics like fewer FTEs and toward speed, highlighting the power of compressing decision-making latency. "The real draw is the opportunity to understand the human-AI learning loop, how you build out a knowledge layer, and how you modernize your data architecture with that layer to allow for orchestration on what I'll call a mega process in your value chain."
Ultimately, McDivitt framed the transformation as a response to a supply chain function in crisis. The sense of urgency stems from the belief that in an environment of increasing disruption, the old ways of managing these systems are no longer viable. In the face of such complexity, human capability alone is simply not enough, he concluded. "The only way out is to release the power of AI and start making more autonomous decisions."