Key Points

  • Launched in 2018, Bank of America's AI virtual assistant Erica has supported more than 2.4 billion interactions across over 42 million clients.
  • Internally, over 90% of Bank of America’s 213,000 employees leverage “Erica for Employees,” its in-house AI agent, leading to outcomes like a 50% decrease in IT service calls.
  • With plans to spend $4 billion this year on AI, Bank of America achieved success where others failed by following a simple strategy: start small.
  • Among other tactics, Bank of America makes sure its investments align directly with customer needs, according to Hari Gopalkrishnan, head of consumer, business and wealth management technology.

Back in 2018, well before the current AI craze began, Bank of America launched a virtual assistant called Erica. Fast-forward seven years, and the system has supported more than 2.4 billion interactions across over 42 million clients.

Meanwhile, over 90% of Bank of America’s 213,000 employees use “Erica for Employees,” its in-house AI agent, leading to outcomes like a 50% decrease in IT service calls. And now, the financial institution is doubling down on that success with $4 billion in AI investments, including its “Ask Merrill” AI agent, among other areas.

"AI is having a transformative effect on employee efficiency and operational excellence," said Chief Technology & Information Officer Aditya Bhasin. "Our use of AI at scale and around the world enables us to further enhance our capabilities, improve employee productivity and client service, and drive business growth."

In a world where many companies are struggling to drive the outcomes they want with data and AI, Bank of America achieved success by following an increasingly common strategy for CIOs and other technology leaders: start small.

In the past, IT transformations were often considered huge, multi-year investments. But as budgets get tighter and expectations for returns get higher, CIOs and other technology leaders are prioritizing smaller, modular projects that deliver quick value and help prove out early use cases before investing deeper. And for Bank of America, that means staying focused on the real opportunity: AI and orchestration.

“We’re not looking to chase the next shiny thing that just got announced somewhere because there’s plenty of things that can be done with what’s already available through simple common sense of AI agents with basic orchestration,” Hari Gopalkrishnan, head of consumer, business and wealth management technology, told CIO.com.

"Our use of AI at scale and around the world enables us to further enhance our capabilities, improve employee productivity and client service, and drive business growth."

Aditya Bhasin

Chief Technology & Information Officer

Bank of America

Recently, Hari unpacked how Bank of America was able to scale Erica to become a leading virtual assistant in the financial services industry, and outlined how other organizations can start small with their own AI investments:

  • Map investments to customer needs: “If you start by asking ‘how can I take this cool technology to market?’ you’re going to spend a lot of money and it’s going to fail. It has to map back to what the customer needs,” Hari told CIO Dive.
  • Aim for gradual improvement: “Over time, it went from a model that was 80% accurate to 85% to well in the north of 90% accurate,” Hari told CIO.com.
  • Create cross-functional teams: “It was the engineering team, the UX team, the appropriate legal team, all opining day in, day out, on all aspects of the platform. The sprints were not just engineers running off and UX coming in weeks later. It had UX teams embedded. In fact, when I used to visit the teams, it was sometimes hard to tell who was in the design team and who was in the engineering team. That was actually the power of how this came together,” Hari said on the “Tearsheet” podcast.
  • Flexibility is paramount: “We’ve got multiple availability zones in our virtual private cloud. We extensively use our virtual private cloud, and as need be, we can burst into public clouds based on the use cases, either for other software providers or for ourselves,” Hari told CIO.com.
  • Find ways to use new technology to maximize existing capabilities: “As we look at the emergence of Generative AI, we actually see that classification can actually get a lot better. You can actually talk even more naturally in a natural language. So that is just a natural sort of expansion of where we go with Erica,” Hari said on the podcast.
  • Embrace hybrid infrastructure: “The mainframe continues to be a very important strategic platform. But over time, we’ve absolutely modernized, figured out what workloads actually belong better in a distributed environment, which workload should be more horizontally scalable across multiple availability zones, and which workloads would be irresponsible for us to just go through a lot of money and rewrite just for the sake of rewriting,” Hari told CIO.com.
  • Lead with the use case, not the LLM: “We don’t want to be wedded to any given model. Essentially, we look at a use case, we look at data classification, we look at our capabilities, and then put together what the right solution [is] for the problem,” Hari told CIO.com.