How Bank Boards Can Support the Implementation of AI
Developing and deploying an effective AI program requires clear direction and strong oversight from the board.
Brought to you by EY
Each year, Ernst & Young LLP and the Institute of International Finance survey financial institutions to better understand the direction of the banking industry and the challenges ahead. The most recent survey makes clear that financial services institutions continue to move forward with artificial intelligence (AI) and machine learning (ML) and are placing greater emphasis on governing those technologies. More than 8 in 10 (84%) report that AI models are widely used in production today, and 86% expect the integration of generative AI (GenAI) to significantly or moderately increase the number of models in their inventory. Many recognize the heavier governance burden and have assigned, or are in the process of assigning, a C-suite executive to oversee AI/ML ethics and governance.
Bank boards are in a unique position to support their organizations as they deploy these new models. We recently identified several key areas where boards can support their organizations, accelerate the implementation of AI and prepare themselves for effective oversight.
Three Actions That Bank Boards Can Consider
1. Support a Strong Technology Foundation for Accurate Data
For many banks, data remediation and deploying next-generation infrastructure can be expensive and time-consuming. Boards should play an active role in overseeing the large capital expenditures that are likely necessary to prepare for this next wave of technology. They can maintain accountability by challenging overly optimistic plans while supporting necessary investments. Boards should also ask for regular updates from management to understand the pace of the work and how it supports the overall data requirements for AI to thrive.
2. Support the Integration of AI Into the Organization
Boards can guide management to take a broader perspective on AI. Instead of focusing solely on whether AI can improve a specific process or drive efficiency, they can direct management to determine how AI can help differentiate the business from competitors. They can also ask questions about potential new competitors, as the democratization of AI will increasingly enable smaller organizations to break apart the value chains of larger companies and challenge them.
3. Weave Responsible AI Into the Fabric of the Company
AI creates new risks for companies in areas such as intellectual property rights, bias and model transparency. Responsible AI programs should add a governance and oversight layer to manage those risks through development and deployment. Getting AI wrong may cause harm to companies and their boards. AI policy oversight should be a key board mandate, and companies should integrate a responsible AI framework into their overall risk management programs. Boards may seek to establish regular, systematic updates from management on AI opportunities and risks, confirm that AI is on the agenda for board meetings and act on any noted issues or red flags. It may also be helpful to designate a single member of management with overarching responsibility for AI business, operational and risk management priorities.
How Directors Can Restructure Their Board for AI Oversight
Ensure that the right skills are in the boardroom.
A recent review of trends affecting board composition in financial services found that high-performing bank boards continually augment and update their skills, knowledge and composition. We are seeing increased focus on education and reboarding of current directors, as well as onboarding of new directors, to ensure that boards remain fit to provide clear oversight and strategic guidance for their companies, particularly regarding fast-moving technologies. The demand for technology and operations expertise is also rising, fueled by emerging technologies and the increasingly complex matrix of risks and opportunities they create.
Determine where oversight of AI should reside.
To help their organizations fully leverage the benefits of AI, boards will need to have discussions that go beyond risk and compliance policies. A common question is whether there should be a designated technology committee. The answer is: it depends. Key questions directors should consider when evaluating such a committee include:
- Given the size of the board, what is the ideal number of committees to work effectively?
- Which committee has the bandwidth and resources to address another key topic, like AI oversight?
- Do the committee’s new responsibilities overlap with the scopes of other committees?
- Will board charters need to be reviewed and updated if new committees or responsibilities are added?
The views reflected in this article are the views of the author(s) and do not necessarily reflect the views of Ernst & Young LLP or other members of the global EY organization.