"The pace of change has never been this rapid, and it will never be as slow as it is again after today. People are overwhelmed, anxious, and trying to process what it all means for their future."
Rob Zelinka
Technology Advisor | Leadership, Life & Legacy
Ex-CIO | Jack Henry

Enterprise AI is advancing faster than most organizations can absorb it, and the human cost is showing. Workforce reductions fund new AI initiatives while loyalty and institutional knowledge are the casualties. When the pace of change outstrips employees' ability to process it, the result is resistance and disengagement. Worse, when AI messaging from the top fails to match what employees experience, trust erodes, and tech rollouts stall.

Rob Zelinka, former VP and CIO at Jack Henry & Associates, has spent three decades leading technology organizations through exactly this kind of change. At Jack Henry, he managed a $200 million annual budget and led a cross-functional organization of more than 500 people. He now also channels that experience into Leadership, Life & Legacy, a platform exploring servant leadership, trust-building, and relational intelligence. He believes the biggest risk in today's AI era is not moving too slowly, but moving faster than people can follow.

"The pace of change has never been this rapid, and it will never be as slow as it is again after today. People are overwhelmed, anxious, and trying to process what it all means for their future," said Zelinka. For him, that anxiety has a specific source: people watching loyalty and tenure offer no protection as organizations fund their AI ambitions through headcount reductions.

Boards and C-suites have moved well beyond the AI experimentation phase, and as they scrutinize business cases for evidence of reduced friction or faster revenue, some organizations are funding AI initiatives through large-scale workforce reductions. Zelinka viewed that approach as a strategic miscalculation: eliminating tenured employees to fund AI quietly strips away the client empathy and institutional knowledge that no model can replicate. That same pressure is also forcing a rethink of what effective technology leadership actually requires.

  • The cost of capital: Zelinka pointed to a recent example of a major tech company eliminating 14,000 roles to fund AI investments. "They are funding their investments in technology by removing people, and that's a slippery slope to be on. People who are loyal to a company, who are hardworking, who are committed to a company are caught in the crosshairs," said Zelinka. The math is only half the story. "How long does it take you to train someone from off the street that comes into your company?" he asked, noting the operational cost of replacing experienced workers. "And how quickly will they learn the culture, how you operate and what makes you different?"

  • Lost in translation: The resulting friction is even rewriting the CTO role. "Oftentimes, a CTO is a Chief Translation Officer," said Zelinka. "Everybody sees the CTO role as a Chief Technology Officer, but here's what you're translating: technical people must understand the business language. What problems are we trying to solve?"

IT initiatives sometimes struggle to hit initial ROI projections, leaving technology leaders battling a perception of weak financial planning. To overcome that, Zelinka suggested sharpening situational awareness and tailoring the narrative to each stakeholder: a CEO focuses on growth, a COO on operational stability, a CFO on how investments in people, process, and product get paid for over time. In many cases, those leaders are all in the same room, each listening for different signals. Without clear translation and the proactive work of establishing guardrails, organizations create compliance risks from the most unexpected source: their own well-meaning employees.

  • Accidental exposure: "The majority of people that work at a company are well-intentioned humans that want to do good work," said Zelinka. "And so they are plugging sensitive information into an AI engine, not realizing the risks." He described a scenario where a healthcare worker plugs patient data into an AI tool to check drug side effects, unaware they just pushed protected health information into a public model's training corpus. The question gets answered and the task gets completed, but the organization now faces a potential HIPAA violation. Zelinka viewed this less as employee malice than as a clear signal that organizations need safer internal options and better governance frameworks.
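The scenario Zelinka describes can, at minimum, be screened before a prompt ever leaves the building. A minimal sketch of that kind of internal guardrail (the pattern names and function below are illustrative placeholders, not a compliance standard or any specific product's API):

```python
import re

# Hypothetical pre-submission guardrail: scan a prompt for PHI-shaped patterns
# before it is sent to an external AI service. The patterns are illustrative
# examples only -- a real deployment would use a vetted detection library.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
    "dob": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def redact(prompt: str) -> tuple[str, list[str]]:
    """Return the prompt with matches masked, plus the categories found."""
    found = []
    for label, pattern in PHI_PATTERNS.items():
        if pattern.search(prompt):
            found.append(label)
            prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt, found

clean, flags = redact("Patient MRN: 12345678, DOB 04/12/1961 - drug side effects?")
# flags -> ["mrn", "dob"]; the identifiers never reach the external model
```

The point is not the regexes themselves but where the check sits: between the well-intentioned employee and the public model, so the question still gets answered without the organization absorbing the compliance risk.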

Getting the broader organization on board requires confronting a harder truth: for many technology workers, the roles they have spent years mastering are disappearing, not evolving. That pressure stalls adoption when leaders blur the line between upskilling and reskilling. Upskilling builds on an employee's existing role, extending familiar capabilities. Reskilling asks people to reinvent themselves entirely, and the pace at which organizations expect that reinvention is often unrealistic.

  • From pets to cattle: "In the world we're moving to, where all of the servers are running on their own, it's called Infrastructure as Code," explained Zelinka. "The system engineer is no longer managing pets, but rather cattle. They're all managed the same, and they're managed by a system. And so the need for the human to be in the loop there doing the work is significantly reduced." For technology workers who have spent years cultivating expertise in those environments, that shift is not abstract: it is the source of much of the anxiety playing out across the industry right now.

  • The dentist's dilemma: For Zelinka, an inability to adapt quickly is rarely a failure of drive. "If a dentist was asked to become a heart surgeon, how long would it take? It probably would take a long time," noted Zelinka. "It doesn't mean the person is incapable. It doesn't mean that they don't have a desire. And the same is the case for technology people." In either case, he suggested leaders ask three blunt questions: Does the individual want to make the change? Do they have the capacity to do it? And does the organization realistically have the time to let that learning take place?
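Zelinka's "pets to cattle" framing maps directly onto how Infrastructure as Code works: a desired state is declared once, and a reconciliation loop, not a person, converges the fleet toward it. A hypothetical sketch of that idea (not any specific tool's API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ServerSpec:
    """One interchangeable unit of the fleet -- 'cattle', not a named 'pet'."""
    image: str
    size: str

def reconcile(desired_count: int, spec: ServerSpec, running: list) -> list:
    """Converge the running fleet toward the declared state, with no human in the loop."""
    fleet = [s for s in running if s == spec]   # drop anything that drifted from spec
    while len(fleet) < desired_count:           # replace or scale up with identical units
        fleet.append(spec)
    return fleet[:desired_count]                # scale down the same way

web = ServerSpec(image="ubuntu-22.04", size="m5.large")
fleet = reconcile(desired_count=3, spec=web, running=[web])
# Any server can be destroyed and recreated from the spec; none is special.
```

Because every server is just an instance of the declared spec, the hand-tended care a system engineer once gave each machine has nowhere to apply, which is exactly the displacement Zelinka is describing.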

To break that cycle, Zelinka came back to culture and habit. While technology can pivot overnight, human habits take far longer to shift. He compared daily workflows to driving home on autopilot while on the phone, arriving safely without remembering a single turn. The analogy captures how deeply ingrained workplace routines become, and no governance framework or deployment timeline accounts for that lag.

  • Clearing the air: "Invite people to the journey early and often, and foster an environment where openness and collaboration are not only encouraged but expected," said Zelinka. "You have to create a psychologically safe environment where people will be willing to say, 'Hey, look. This is all new technology. I'm a little anxious about it. I want to learn it, but this is what makes me nervous.' And then have people embrace you."

For Zelinka, that safety ultimately depends on honest intent. He recalled a 10-K drafting session at a previous company where the executive team included the familiar line about people being their greatest asset, until the CEO asked them to remove it. "The companies that say people are their number one asset usually are not modeling that behavior," said Zelinka. "He was worried that we would be putting ourselves in that bucket of companies." That firm chose instead to let its history carry the message. Its founders had launched the business decades earlier with a simple belief: take care of your people, and they will take care of your clients, who in turn will tell others you are a good company to work with. That mantra held through fifty years of leadership changes and demographic shifts, a reminder that culture built on genuine behavior compounds in ways that no AI deployment timeline can replicate.

"If you truly are going to use technology, whether it's AI or automation, to reduce operational expense, then just tell people that," said Zelinka. "Don't misrepresent your intent and purpose. Because once you lose credibility, it's gone. You don't usually get it back."