MPs warn AI risks threaten UK financial system

Heavy reliance on a few US tech groups adds another layer of risk, MPs say

By Jonalyn Cueto

The UK’s financial system faces potentially serious harm from artificial intelligence because of inadequate regulatory oversight, according to a Treasury Select Committee report published on Tuesday.

The committee criticised the Bank of England, the Financial Conduct Authority (FCA), and the Treasury for adopting a “wait-and-see” approach while more than 75% of UK financial services firms now use AI technology. Insurers and international banks represent the largest adopters, using the technology to automate administrative functions and deliver core services, including processing insurance claims and credit assessments.

“Based on the evidence I’ve seen, I do not feel confident that our financial system is prepared if there was a major AI-related incident, and that is worrying,” said Dame Meg Hillier, chair of the Treasury Select Committee. “I want to see our public financial institutions take a more proactive approach to protecting us against that risk.”

The committee said it had received a “significant volume of evidence” about AI risks to financial services consumers, The Independent reported. Concerns included a lack of transparency in AI-driven decision-making for credit and insurance, financial exclusion for disadvantaged customers, misinformation from unregulated AI search engines and increased fraud risks.

The report warned that AI-driven market trading could amplify herding behaviour, potentially triggering a financial crisis. Evidence also highlighted UK firms’ reliance on a small number of US technology companies for AI and cloud services, alongside increased cyber security risks.

No AI-specific rules in current framework

The UK currently has no AI-specific legislation or financial regulation. The FCA and the Bank of England rely on existing regulatory frameworks to supervise firms’ use of AI, an approach MPs said creates uncertainty for businesses and increases risks to consumers and to the integrity of the financial system.

The committee recommended that the Bank of England and the FCA conduct AI-specific stress testing to assess businesses’ readiness for future AI-driven market shocks. MPs also called for the FCA to publish practical guidance on AI by the end of the year, clarifying how consumer protection rules apply and establishing accountability for AI-caused harm.

Despite the Critical Third Parties Regime being established more than a year ago to oversee non-financial firms providing critical services to UK financial services, no organisations have been designated. The committee urged the Government to designate AI and cloud providers deemed critical in order to improve oversight and resilience.

A Bank of England spokesperson told The Independent the institution had “already taken active steps to assess AI-related risks” and would consider the recommendations carefully. A Treasury spokesperson said the Government would “strike the right balance between managing the risks posed by AI and unlocking its huge potential.”
