The common wisdom for AI adoption is to start small with isolated proofs-of-concept. But a bolder strategy is gaining traction, one that argues for tackling complex, cross-functional challenges from the start. The urgency is palpable: in just a few weeks, the primary question from executives has shifted from "What is it?" to "How do I do it?"

Going big from the start drives faster buy-in, shows a clearer return on investment, and builds the foundational knowledge layer for future orchestration. But the approach hinges on a counterintuitive idea: messy data is not the blocker most think it is.

We spoke with Chris McDivitt, a Managing Director at Accenture and the Global Lead for their Autonomous Supply Chain Capability. With over 30 years of experience transforming complex operations at giants like UPS and Capgemini, his strategy challenges decades of IT convention. McDivitt believes this go-big vision is the only way for large organizations to escape pilot purgatory and unlock the true potential of AI.

  • The perfect data myth: McDivitt’s first move is to challenge the belief that a massive, multi-year data cleansing program is a prerequisite for AI. The fear of imperfect data, he said, is a painful leftover from the ERP era and a mindset holding businesses back. "Don't think of this as a massive, three-year data program. It's about future-proofing the enterprise for an inevitable cascade of technologies, from physical AI, then humanoids, and then quantum."

For CIOs worried about another massive data project, McDivitt offered a counterintuitive solution: "The AI itself can help construct the foundation it runs on. You can use agents as builders of your data products. They can do validation steps, and they can work on metadata."
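To make the "agents as builders" idea concrete, here is a minimal sketch of what an agent-built data product could look like: automated validation checks plus drafted metadata. The names are illustrative, and the `draft_description` helper is a hypothetical stand-in for an LLM call, not a specific vendor API.

```python
# Sketch of an agent acting as a data-product builder: it runs validation
# steps and attaches metadata so downstream consumers get context for free.
from dataclasses import dataclass, field

import pandas as pd


@dataclass
class DataProduct:
    name: str
    frame: pd.DataFrame
    metadata: dict = field(default_factory=dict)
    issues: list = field(default_factory=list)


def draft_description(columns: list) -> str:
    """Hypothetical placeholder for an LLM step that writes human-readable metadata."""
    return f"Table covering: {', '.join(columns)}"


def build_data_product(name: str, frame: pd.DataFrame) -> DataProduct:
    product = DataProduct(name=name, frame=frame)

    # Validation steps the agent can run without human babysitting.
    for column in frame.columns:
        null_share = frame[column].isna().mean()
        if null_share > 0.05:
            product.issues.append(f"{column}: {null_share:.0%} missing values")
    if frame.duplicated().any():
        product.issues.append("duplicate rows detected")

    # Metadata the agent attaches to turn a raw table into a described product.
    product.metadata = {
        "columns": list(frame.columns),
        "row_count": len(frame),
        "description": draft_description(list(frame.columns)),
    }
    return product


if __name__ == "__main__":
    orders = pd.DataFrame({"order_id": [1, 2, 2], "qty": [10, None, 5]})
    product = build_data_product("orders", orders)
    print(product.metadata)
    print(product.issues)
```

The point is not the specific checks but the division of labor: the agent does the repetitive validation and documentation work that a multi-year cleansing program would otherwise absorb.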

  • A mesh architecture: The approach centers on building a modern data layer through a data mesh, an architectural shift that allows organizations to productize their data. The mesh architecture shatters the long-dominant data lake model, whose limitations McDivitt explained with a simple analogy. "A data lake is like a pantry full of ingredients where you don't understand the relationships. A data mesh allows you to actually create a recipe," he said.
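The "recipe" analogy maps onto a simple idea: in a mesh, each domain publishes its data as a product with an owner, a schema contract, and freshness expectations, rather than dumping raw files into a shared lake. The sketch below uses only the standard library, and the field names and example product are illustrative assumptions.

```python
# Minimal sketch of a data-product contract in a mesh: the "recipe" that a
# domain team publishes so others can discover and consume its data.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataProductContract:
    name: str             # e.g. "supplier-lead-times"
    owner: str             # the domain team accountable for the product
    schema: dict            # column name -> type, the published interface
    freshness_hours: int    # how stale the product is allowed to become


REGISTRY: dict = {}


def publish(contract: DataProductContract) -> None:
    """Register a data product so other domains can find it by name."""
    REGISTRY[contract.name] = contract


publish(DataProductContract(
    name="supplier-lead-times",
    owner="procurement",
    schema={"supplier_id": "str", "lead_time_days": "int"},
    freshness_hours=24,
))
```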

  • The knowledge graph: But the mesh is only part of the solution. The crucial next layer is a knowledge graph that automates the contextualization of data products. The knowledge graph is where the digital brain truly comes to life, by encoding the invaluable tribal knowledge of an organization—the expertise of its best planners and operators. The move represents a fundamental pivot, not an add-on. As McDivitt put it, “You can't just take an agentic architecture and stitch it onto an S4 ERP platform. It doesn't work like that.”
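As a rough illustration of that contextualization layer, the sketch below models data products and a decision as nodes in a graph, with edges carrying the kind of planner heuristics that usually live only in experts' heads. It uses networkx purely as an illustrative choice; McDivitt does not name a specific tool, and the products, decision, and rules are invented for the example.

```python
# Sketch of a knowledge graph over mesh data products: nodes are products and
# decisions, edges encode the tribal knowledge that connects them.
import networkx as nx

graph = nx.DiGraph()

# Data products from the mesh become addressable nodes.
graph.add_node("supplier-lead-times", kind="data_product")
graph.add_node("production-schedule", kind="data_product")
graph.add_node("expedite-decision", kind="decision")

# Tribal knowledge encoded as explicit, queryable relationships.
graph.add_edge("supplier-lead-times", "expedite-decision",
               rule="if lead time slips more than 3 days, expedite the top-20 SKUs")
graph.add_edge("production-schedule", "expedite-decision",
               rule="never expedite if the line is already at capacity")

# An agent can traverse the graph to see which products and rules bear on a
# decision, instead of rediscovering the relationships from raw tables.
for source, _, data in graph.in_edges("expedite-decision", data=True):
    print(source, "->", data["rule"])
```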