Key Points

  • The AI industry faces a growing infrastructure problem as demand outpaces the power and capacity needed to run modern models.

  • Doneyli De Jesus, Solutions Architect at ClickHouse, said this pressure pushes companies like Anthropic to control more of their stack.

  • He points to long build times, rising energy needs, and durable enterprise revenue as the factors that make full-stack infrastructure the path forward.

The AI industry is no longer obsessed with whose model is cleverer. The real contest is unfolding in concrete, steel, and megawatts. Anthropic’s plan to pour tens of billions into its own data centers shows how the center of gravity has shifted toward companies that control every layer of their stack. The next wave of AI leaders will be the ones willing to build the foundation they plan to stand on.

Doneyli De Jesus is an AI Architect and Strategist with over 20 years of experience translating C-suite goals into technical reality. Currently a Solutions Architect at ClickHouse, with a track record of architecting AI solutions at giants like Snowflake and Elastic, De Jesus has a frontline view of this infrastructure landscape. For him, Anthropic's move wasn't just smart. It was inevitable.

"If you want to optimize models effectively, you need to own the stack from silicon to software. That’s what Anthropic is doing now," said De Jesus. According to him, Anthropic's strategy follows a path already forged by rivals. OpenAI paved the way with its deep partnership with Microsoft, later expanding to include Oracle for more capacity. Meanwhile, Google has leveraged its own massive, long-standing infrastructure to push the boundaries of what's possible.

But why now? De Jesus explained that the timing is driven by two distinct forces: one technical, the other financial.

  • Durable by design: With model breakthroughs slowing, the edge moves to hardware optimization, a long-term effort that only becomes possible once the business has a stable foundation. "Anthropic is at a stage where they have a stable base of enterprise customers and a healthy book of business committed for years into the future," De Jesus said. "Now they can go out and make those investments without being at risk of overleveraging on debt."

  • Small wins, big lift: De Jesus noted that Anthropic now stands out as the only major model maker running across all three major cloud platforms: AWS, Google Cloud, and Microsoft Azure. "Because they own the stack end-to-end, they can find those marginal efficiencies. They train these models to be optimized for their own infrastructure, and they know the infrastructure so well that they can find these benefits."

"If you want to optimize models effectively, you need to own the stack from silicon to software. That’s what Anthropic is doing now."

Doneyli De Jesus

Solutions Architect
ClickHouse

The strategy of owning infrastructure also has geopolitical implications. Anthropic's initial focus on US-based data centers reflects a growing concern among nations: the geography of a data center is now treated as a matter of national security. Some nations fear that foreign infrastructure could be used as a "Trojan horse," making local control a top priority. De Jesus said this logic extends to allied nations, pointing to Canada as an "obvious" candidate for expansion thanks to its abundant land, cheap electricity, and deep pool of AI talent.

  • Sovereignty stakes: "If we are to believe what has been promised with AI, we should also treat it as a national security topic. That’s why countries feel the need to keep highly sensitive data within their own borders, where they have end-to-end control and aren't dependent on another government." But beneath corporate strategy and geopolitics lies a more practical reality: energy.

  • Out of juice: De Jesus said the real constraint is energy, not model design, and that efficiency now depends on the physical limits of power, cooling, and infrastructure. "Efficiency isn’t just about training models; it’s about power, cooling, and the entire data center stack. These data centers are literally cities in terms of energy requirements, and experts talk about needing four to seven years just to get the grid ready to support them."

The energy bottleneck naturally raises the question of whether AI companies will become energy companies. De Jesus predicted they will likely make "opportunistic investments" in energy partners, a strategy that treats power as part of the supply chain to be secured, rather than a new line of business to operate.

De Jesus said leaders feel pressure to invest now because capacity limits already show up in the real world, from slower model responses to usage throttling when infrastructure falls behind demand. "The risk of not making the investment now is higher than the risk of making it. These projects take years to build, so if you wait until you realize you need it, it will be too late. Right now, demand already outstrips the supply." That pressure is shaping every major decision in the AI ecosystem. "Energy is the bottleneck, and data is the fuel for this AI revolution."

The views and opinions expressed are those of Doneyli De Jesus and do not represent the official policy or position of any organization.