AI in Financial Services
House of Commons Treasury Select Committee
Submission by Later Life Ambitions
April 2025
Rationale
AI is being deployed across various sectors within financial services, including retail banking, investment banking, insurance, and pensions. The purpose of this inquiry on “AI in financial services” is to explore how the UK financial services sector can take advantage of the opportunities AI presents while mitigating any threats to financial stability and safeguarding financial consumers, particularly vulnerable consumers.
Our submission focuses on the benefits and risks to consumers arising from AI, particularly for vulnerable consumers.
About Later Life Ambitions
Later Life Ambitions (LLA) is a coalition that brings together the collective voices of over a quarter of a million pensioners from all four nations of the United Kingdom.
LLA consists of three member organisations: the Civil Service Pensioners’ Alliance (CSPA), the National Association of Retired Police Officers (NARPO) and the National Federation of Occupational Pensioners (NFOP).
As a campaigning group, LLA has ambitious aspirations for the next generation of pensioners. From fair pensions to safe and sustainable care services, and from accessible housing to regular bus services to promote independence, LLA calls for bold and forward-looking action from political leaders and other decision-makers.
In the UK today, there are 12.95 million people in receipt of the State Pension[1]; by 2030, there will likely be more than 13 million people in later life – each one with hopes, aspirations and needs. The issues pensioners face today will also impact on future generations of pensioners unless today’s policymakers are willing to confront the challenges now. LLA welcomes the Committee’s inquiry as an important opportunity to examine the impact of emerging technology such as AI on the accessibility of financial services.
Our response to this inquiry focuses on the potential risk to the economy and society of greater AI deployment in financial services for those already at risk of financial exclusion or those with poor digital literacy. We want people in later life to be able to have ambitions for their futures and to live fulfilling lives without societal obstacles and hindrances getting in the way. Our members already report difficulties using these services, and those difficulties will only be exacerbated by an AI revolution that automates and anonymises decisions, impeding them from living the lives they want to lead, and by the greater risk of AI-enabled fraud that comes with such widespread deployment of this technology.
Our call: ensuring accessibility and equity
The Bank of England and Financial Conduct Authority’s November 2024 survey into the use of AI in financial services found that 75% of firms are already using AI, with a further 10% planning to use it over the next three years.[1] While AI offers benefits, ethical considerations and accessibility issues must be addressed to ensure these technologies serve those in later life fairly and effectively. Our members’ concerns regarding data privacy, algorithmic bias and the digital divide highlight the importance of developing transparent, accountable AI systems and enhancing digital literacy among older adults.
Whilst artificial intelligence presents opportunities to improve internal productivity and ease resourcing constraints, AI must not replace the human experience. Some older people struggle with chatbots, especially if they do not realise that no human is responding to their query in real time. As such, the provision of alternative hotlines staffed by people must be an essential requirement to assist older or vulnerable people trying to ask a question or access a financial service (expanded upon in paragraph 15).
If used effectively, AI should free up staff to give a more personalised and nuanced response to vulnerable customers who need the most support when making financial decisions – but the transition is likely to be a gradual one.
What benefits to consumers might arise from using AI in financial services? For example, could AI be used to identify and provide greater assistance to vulnerable customers?
With technological advancement in the AI space, perpetrators are evolving new and nefarious ways to obtain the trust, and eventually money or resources, of older people. With the rise in use and development of generative AI, scammers are profiting far more from scam calls than ever before, presenting further challenges to the financial and emotional wellbeing of those in later life.
The principal benefit to older people of widespread AI adoption across financial services concerns scam detection and fraud protection. Older people are a demographic more susceptible to deceit and confidence-trickery online, and AI-powered fraud detection systems can monitor transactions for unusual activity and flag suspicious behaviour. In a similar vein, correctly trained AI chatbots can provide real-time alerts for suspicious activity, which will aid older people’s use of digital financial services – especially important as banks and other financial institutions continue to close physical facilities at an alarming rate.
In addition, AI could be used to identify and ‘red flag’ vulnerable people struggling with automated chatbot responses, or showing other indicators of vulnerability, and automatically switch them to a human to deal with their query or issue. Indeed, many customers are unaware of their vulnerable status. When assessed against Financial Conduct Authority (FCA) criteria[2], two-thirds of UK adults – equivalent to 35 million people – are potentially vulnerable, often without realising it. Many individuals are therefore unaware that they live in vulnerable circumstances, leaving them at greater risk as technology and company policies evolve.
As financial services companies begin to deploy AI tools and processes more widely, there will be increased granularity of customer data to continually train AI models. This could improve a bank’s understanding of customer behaviour and transaction patterns which, in turn, could identify early signs of vulnerability.
Are there any current or future concerns around data protection and AI in financial services?
Data privacy and security concerns are paramount, as AI systems require access to sensitive personal and financial information to offer personalised advice. Breaches in these systems could expose customers to financial fraud and identity theft. Some older people feel particularly exposed to privacy threats: they may not be digitally literate and may feel less able to identify phishing threats or social engineering schemes that enable malicious actors to gain access to their data or personal IT systems.
This concern is exacerbated by the use of third-party providers which, unlike banks, are not presently considered critical national infrastructure and may not have the same level of cybersecurity as the customer-facing financial services organisation.
What sort of safeguards need to be in place to protect customer data and prevent bias?
It will be of paramount importance that older customers, many of whom are not acquainted with artificial intelligence, are aware when any decision made on their finances, or any advice received, is provided by AI rather than an employee – the transparency and ‘explainability’ described by the FCA and Prudential Regulation Authority.
Similarly, companies providing financial services need to be attuned to the impact of digital exclusion and the relative lack of digital literacy amongst some older people. Whilst many older people are quite comfortable using IT, not all older people can access technology and benefit from AI. They may not be able to afford the new platforms on which AI will operate, nor have access in parts of the country where superfast broadband and 5G have yet to be installed.
If a customer requests to speak to a human to discuss a case or divulge personal information rather than engage with a chatbot, this request should be granted, especially if there is an early indication of customer vulnerability.
Fundamentally, AI systems are trained on data which might be unbalanced. For example, they may produce outputs which discriminate on the basis of age, amongst other characteristics, without fair justification. This risk may subside over time as the body of data on which a bank can train its model grows, but the initial risk of discrimination remains.
What is the risk of AI increasing embedded bias? Is AI likely to be more biased than humans?
Many older people have been overwhelmingly excluded from the development of AI technology and, due to wider digital exclusion, their perspective has been sidelined from training data, embedding algorithmic bias into many AI systems. As referenced in paragraph 2, when dealing with financial matters there must be clearly defined oversight of any systems, and customers must have access to customer service hotlines staffed by humans, rather than AI-powered chatbots or similar technologies.
If AI chatbots or automated customer service hotlines, for example, supersede human customer service advisers, those ill-acquainted with technology will only struggle more to access the essential services on which they rely. This is especially true as the number of bank, building society and post office branches continues to decline at an alarming rate, leaving some communities and high streets without the physical presence of any financial institution – many financial institutions have opted to invest in better digital services such as AI chatbots, rather than bricks-and-mortar branches.
As with any model, the successful deployment of AI will rely on appropriate inputs which mitigate any discriminatory tendencies exhibited by the models. Enshrining the principle of algorithmic fairness into AI systems and processes used in financial services will require a formal encoding of the different social groups and/or characteristics into the dataset or algorithm. Without this, AI algorithms may perpetuate or even worsen inequalities, offering suboptimal advice to various demographic groups.
AI should be designed according to the principles of universal design – that is, a technology which responds to the needs of all age groups and does not prioritise or exclude any demographic group: an ageless AI. The inclusion of older people as “relevant social groups” during the development of AI technology is a necessary step to safeguarding the fairness and inclusiveness of these products and services when deployed in financial services.
Later Life Ambitions’ manifesto
Our manifesto, Standing by Pensioners, sets out our vision for a better deal for older people through six key asks. We are calling for:
The UK Government and the devolved administrations to combat digital technology’s role in social exclusion and access to vital services.
Investment in local bus and rail services to fund uprated concessions and to improve accessibility and assistance for older people on all new bus and railway stock and facilities.
The UK Government to create an Older People and Ageing Commissioner for England and Scotland, following the examples set by Northern Ireland and Wales.
A National Social Care Service integrated with the National Health Service that remains free at the point of delivery.
A commitment to guarantee the State Pension triple lock for at least the duration of the current Parliament.
All new homes to meet the Lifetime Homes standard with a national strategy for more adaptable, accessible homes across all tenures.
Conclusion
LLA wants to see a comprehensive strategy for ensuring that the widespread deployment of AI systems and tools does not adversely impact older people’s understanding of, and access to, vital financial services.
The Department for Science, Innovation and Technology and the Financial Conduct Authority are well placed to understand and consult with those who rely on these services to participate in society, and with the funding allocated by central government there is an opportunity to improve older people’s quality of life. Ease of access and safety for vulnerable customers must be prioritised, both in existing legislative and regulatory frameworks – and in future product and service development processes.
We will work with everyone concerned to reflect the views of our members and find solutions across the board. Please let us know if you require any further information or case studies as part of your inquiry.