Key Points

  • Agentic AI breaks down when fragmented workflows, siloed knowledge, and inconsistent data turn action-taking systems into operational and security risks.

  • Jason Andrews, Vice President of Strategy and Planning for Engineering Operations at Cisco, described how inconsistent human work habits prevent AI from moving reliably from insight to execution.

  • Standardizing workflows, consolidating systems in the cloud, and governing data quality create the foundation AI agents need to act safely, scale productivity, and remain resilient over time.

Agentic AI succeeds or fails based on the quality and consistency of the processes and data beneath it. As organizations deploy action-taking systems, fragmented workflows and informal knowledge stop being inefficiencies and start becoming failure points. For CIOs, data readiness is now an operational priority, because these systems don't just analyze work; they act on it, forcing a redefinition of infrastructure around how work is structured and governed.

Jason Andrews is Vice President of Strategy and Planning for Engineering Operations at Cisco, where he leads operations for a division of more than 22,000 employees. Before Cisco, he held senior leadership roles at Oracle, where he worked on large-scale cloud operations, program management, and process transformation across global teams. He believes that agentic AI only works when organizations first confront how fragmented workflows, tooling, and data actually behave in practice.

"AI can do amazing work, but only if the underlying data is high-quality and standardized. When everyone works differently, with data siloed in PowerPoints and one-off systems, AI can summarize but it can’t reliably take action," said Andrews. And the fragmentation is widespread. According to Atlassian’s State of Teams report, 56% of workers say the only way to get the information they need is to ask someone or schedule a meeting, even as 89% of executives say their organizations need to move faster than ever.

  • Process as plumbing: The first hurdle is the sprawl of inconsistent human work habits. For Andrews, the solution begins by redefining "infrastructure." He said that the business process itself should be treated as the foundation for AI, requiring the same focus that leaders traditionally give to physical hardware. "The infrastructure that leaders need to focus on is the business process itself. That level of standardization is what allows us to harvest mass amounts of data and effectively use AI to solve problems," he explained. He pointed to his own organization as proof, where a division of 15,000 engineers runs on just six standard workflows, creating the consistency needed to effectively leverage AI.

  • Building the AI-way: That view predates the current AI cycle. In earlier transformation work, Andrews framed the goal as "building the information highway to enable better data-driven decisions," starting with aligned processes and taxonomy before tools, so that AI has clean, connected information to work from.

  • In the clouds: Solving this process sprawl requires an architectural shift toward the cloud, said Andrews, because modern AI is built to operate inside shared, connected systems. "Modern AI solutions, including integrations from providers like OpenAI, plug directly into cloud applications. They simply do not plug into on-prem environments," he noted. "By consolidating our tools onto a single cloud platform, I can now integrate our different AI systems together because I have standard workflows that are linked. I've given them both context and allowed those AI agents to function back and forth."
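The "linked workflows" idea above can be sketched in code: when agents share one standard schema, they can hand work back and forth without a translation layer. This is a minimal illustration, not Cisco's actual system; the `WorkItem` fields, workflow names, and agent functions are all hypothetical.

```python
from dataclasses import dataclass, field

# Illustrative standardized work item. A shared schema is what lets
# separate agents pass work between each other (names are hypothetical).
@dataclass
class WorkItem:
    workflow: str                  # must be one of a small, fixed set
    status: str = "new"
    notes: list[str] = field(default_factory=list)

# Assumed examples of a deliberately small set of standard workflows.
STANDARD_WORKFLOWS = {"incident", "change-request"}

def triage_agent(item: WorkItem) -> WorkItem:
    # Reject anything outside the standard set instead of guessing.
    if item.workflow not in STANDARD_WORKFLOWS:
        raise ValueError(f"unknown workflow: {item.workflow}")
    item.status = "triaged"
    item.notes.append("triage: routed to execution agent")
    return item

def execution_agent(item: WorkItem) -> WorkItem:
    # Because the schema is shared, this agent needs no custom adapter.
    assert item.status == "triaged"
    item.status = "done"
    item.notes.append("execution: task completed")
    return item

item = execution_agent(triage_agent(WorkItem("incident")))
print(item.status)  # done
```

The point of the sketch is the constraint, not the code: with only a handful of standard workflows, every agent can rely on the same fields and states, which is what makes chaining them safe.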

"AI can do amazing work, but only if the underlying data is high-quality and standardized. When everyone works differently, with data siloed in PowerPoints and one-off systems, AI can summarize but it can’t reliably take action."

Jason Andrews

VP, Strategy & Planning for Engineering Operations
Cisco

In this new agentic environment, the consequences of bad data can escalate significantly. Where flawed data once led to a bad chart, it can now trigger serious operational failures that demand security models built for the new reality. Andrews laid out a scenario from an agentic workflow being developed for a data center help desk, where an AI is granted permission to manage physical assets.

  • The price of imprecision: Without that groundwork, Andrews said, organizations are effectively trying to run autonomous systems on an unfinished road. "With clean, documented data, an AI agent can automatically execute a task like configuring a port—and it's actually easy. But if that underlying data is wrong, that same simple action could shut off the wrong port, place it in the wrong subnet, and create security issues or operational outages."

  • Keeping up with the Joneses: The growing risk of autonomous failure is forcing a shift in leadership priorities as data governance and standardization move from background clean-up to competitive necessity. "Teams are getting backed into a corner because they see their peers moving faster, which creates an urgent focus on standard workflows as a prerequisite for building and leveraging AI," he said. That pressure favors organizations that have already done the unglamorous work of process discipline at scale. "Companies like Amazon built delivery machines by standardizing everything first, and that core process infrastructure is what enables the productivity gains that let organizations operate with much leaner teams."
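The port-configuration scenario Andrews describes can be sketched as a guarded agent action: the agent refuses to act unless the underlying record has been verified against a source of truth. This is a hedged sketch, not a real network API; the `PortRecord` fields and the `configure_port` function are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical inventory record an agent would act on; field names are
# illustrative, not an actual data-center schema.
@dataclass
class PortRecord:
    switch_id: str
    port: int
    subnet: str
    verified: bool  # has this record been validated against a source of truth?

def configure_port(record: PortRecord, dry_run: bool = True) -> str:
    """Refuse to act on unverified or incomplete data; otherwise
    describe the action the agent would take."""
    if not record.verified:
        raise ValueError(
            f"refusing to act: {record.switch_id}/{record.port} is unverified"
        )
    if not record.subnet:
        raise ValueError("refusing to act: subnet missing from record")
    action = f"assign {record.switch_id} port {record.port} to {record.subnet}"
    return f"[dry-run] {action}" if dry_run else action

good = PortRecord("sw-01", 24, "10.0.4.0/24", verified=True)
print(configure_port(good))  # [dry-run] assign sw-01 port 24 to 10.0.4.0/24
```

The design choice mirrors the quote: with clean, verified data the action is trivially easy; the guard exists so that a wrong or stale record fails loudly instead of silently reconfiguring the wrong port.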

Ultimately, this change is prompting a re-evaluation of a core IT principle: resiliency. In the age of agentic AI, the biggest threat is no longer a failing server but an automation that breaks, or acts wrongly, because the data it relies on has silently drifted out of quality. In Andrews's view, the problem hasn't disappeared; it has simply moved up the stack, requiring a new focus on deploying the technology responsibly and at scale.

"The bigger risk is no longer slow server performance. It's AI workflows breaking or taking wrong actions because the underlying data quality isn't there. The problem doesn't go away, it just shifts," Andrews concluded. "Resiliency still matters, but the focus moves from the quality of the systems behind the data to the quality of the data itself. That is the new concern for organizations."