Criminals are using artificial intelligence to increase their “attack rate” on UK victims, with investment fraud and romance scams hitting record levels in the first half of the year.
The number of confirmed fraud cases surpassed 2mn in the first half of this year — a 17 per cent rise on the previous year — according to statistics compiled by UK Finance, the banking trade body. The amount of money criminals stole from victims surpassed £629mn, a 3 per cent rise.
Fraudsters are using AI “to enhance tried and tested tactics more quickly, at a greater scale, in different languages and to a greater effect,” said Ben Donaldson, managing director of economic crime at UK Finance.
“It’s pretty much impossible for fraud victims to tell if AI has played a role or not, but anecdotally, it absolutely has,” he said. The cost and availability of the technology mean it is fast becoming part of everyday life, even for criminals, who are using AI to scale up the distribution of scam texts, emails and direct messages on social media platforms and to create ever more sophisticated content to deceive their victims.
Investment scam losses increased by 55 per cent to nearly £100mn in the period — an average loss of more than £15,000 per victim. The sophisticated tactics that criminals use are increasingly powered by AI, such as deepfake videos featuring trusted financial figures appearing to punt cryptocurrency investment opportunities or share tipping services.
Criminals often create fake websites allowing victims to log in and view a dashboard showing how well their “investment” has performed, and even withdraw some of the profits. However, this is a prelude to the scammers tempting them to invest even larger amounts.
Cases of romance fraud rose by 19 per cent, with losses increasing by 35 per cent to £20.5mn. Romance scams are typically carried out over a long period as fraudsters gain the trust of victims, with an average of nine scam payments per case, UK Finance said, adding that it was aware of some cases involving more than 100 separate money transfers.
Banks are also using AI to prevent fraud with increasing effectiveness. In the first six months of this year, advanced security systems prevented £870mn of unauthorised fraud, 20 per cent more than in the first half of 2024 and equivalent to 70p in every £1 of attempted fraud.
Ruth Ray, director of fraud policy at UK Finance, said banks were increasingly investing in AI-powered tools that worked in real time to detect anomalies that could indicate a customer was “under the spell of a fraudster”.
However, this has led to a rise in lower-value fraud: criminals often make multiple purchases of cheaper items that can easily be sold on, such as gift cards, in an attempt to bypass banks’ anti-fraud systems.
The use of other new technologies is rising too. The Dedicated Crime and Payment Card Unit (DCPCU), a specialist police unit sponsored by the banking industry, has caught criminals in crowded areas of central London using “SMS blasters”, devices that act as illegitimate phone masts. These have been concealed in the boots of cars and even inside suitcases on the London Underground network, blitzing every mobile phone in the vicinity with spam texts.
Victims typically click on a link leading to a fake website impersonating a government body or delivery firm, where they are duped into handing over personal details.
UK Finance stressed that social media companies and the telecommunications industry share responsibility for stepping up fraud detection and prevention.
“Rather than rely on banks to try and prevent crime at the moment it’s happening, we should be working together to prevent fraud from occurring in the first place,” said Donaldson.
