

The CISO’s role in higher education is shifting. As universities accelerate AI adoption across every function, security leaders who once focused on risk prevention are being asked to do something harder: guide institutions on how to implement AI securely, privately, and ethically. Whether security becomes a brake pedal or an accelerator depends on how leaders show up to that conversation.
Lester Godsey is a leader at the center of this transformation. As the Chief Information Security Officer for Arizona State University (ASU), Godsey is responsible for protecting a highly innovative academic institution. His perspective is built on over 30 years of IT experience, including CISO roles for Maricopa County, the fourth-largest county in the United States, and the City of Mesa. He is now championing a model where security functions as an accelerator.
“We’re using multi-agent frameworks to orchestrate security workflows, so routine tasks like incident triage and identity management happen automatically, enabling us to continually scale innovation while keeping sensitive data under control,” Godsey said. At ASU, agentic automation is already operational, and the security controls that govern it were built in from the start, not bolted on after.
Godsey’s teams embed security directly into the university’s daily workflows. The goal is to reduce friction by turning compliance burdens into on-demand, self-serve tools that make the secure path the easiest one to take.
Policy to portal: Nobody was going to read 19 new security standards, so the team transformed the policies into a self-serve Q&A bot. “Instead of expecting our users to read all of them, we gave them access to a bot trained on our existing policies and standards. They could ask any question they wanted around multifactor authentication, encryption, passwords, or elevated privileges,” Godsey explained.
Expertise on demand: The team captured the institutional knowledge of their most seasoned identity management experts and fed it into an AI bot for the university help desk. “The bot gives technicians immediate access to accurate answers, allowing them to better support the person on the other end, who is on a phone or online,” Godsey said.
This philosophy of enablement extends to the university’s high-profile initiatives. Godsey noted that the accelerated adoption of AI is shining a greater light on problems that have plagued technology for decades. He believes few organizations can confidently answer where all their data lives, what it is, and who has access.
Old problems, new catalyst: That insight became a driver for ASU’s significant partnership with OpenAI, a move that grounds enterprise AI governance in the data challenge itself. “The challenges AI exposes within higher education, like unknown data access and classification gaps, aren’t new. We’re finally using automation to tackle decades-old problems at scale,” Godsey said.
Enabling with guardrails: The university’s approach to vendor agreements reflects the same logic. “We brokered an enterprise agreement that gives every single faculty, staff, and student at ASU access to ChatGPT Edu. The platform provides guardrails and privacy pre-agreements that prevent the university’s data from being used to train models,” he added. The expansive collaboration is now featured by OpenAI as a case study in responsible, large-scale deployment.




