Thousands of women in Scotland are being targeted with revenge porn created using AI technology that is increasingly capable of producing lifelike but fake sexualised images.

The Revenge Porn Helpline, part of the SWGfL charity, said “intimate image abuse” was now at record levels but insisted the rising number of cases being reported to it was only the “tip of the iceberg”.

The helpline found that 1.4 per cent of women in the UK are now being targeted every year, equivalent to almost 30,000 in Scotland, but said these were only the cases brought to its attention.

The escalation, the helpline said, is partly being driven by the development of artificial-intelligence software that allows pictures to be stolen from victims’ social media accounts and turned into pornography, which can then be shared, swapped or sold online.

Perpetrators are using “nudification” apps, which can take an image of a person and apparently remove their clothes to make it appear sexualised, while other software can insert a victim’s face into a sexualised picture or video.

Sharing, or threatening to share, intimate images without consent is a crime under the Abusive Behaviour and Sexual Harm (Scotland) Act 2016. Even if a victim originally agreed to the pictures being taken, it is an offence to share them by text or social media, or even to show them to another person.

In one of the first cases of its kind in Scotland, a man was convicted in August of taking a former schoolfriend’s picture from the internet and using AI to create fake nude images of her, which he then distributed.

Callum Brooks at Glasgow Sheriff Court, where he admitted using AI to create fake nude images of a former schoolfriend and sharing them with friends. He was fined £335

SPINDRIFT

Callum Brooks, 25, from Glasgow, admitted taking photos from a former schoolmate’s social-media posts and using AI to generate fake naked pictures of her, which he then sent to his friends. After he was fined £335 at Glasgow Sheriff Court, the victim said: “The photo of me fully clothed had been altered so it was completely naked. It was a big feeling of betrayal.”

Kate Worthington, a senior practitioner at the Revenge Porn Helpline, told The Scottish Mail on Sunday that intimate image abuse was highly distressing for victims. “It can be hugely embarrassing and humiliating, and can lead to mental health impacts.

“It creates a feeling of paranoia. If it’s sunny and someone smiles at you, are they smiling because it’s a sunny day or because they’ve seen your intimate images online?”

Worthington said the technology for generating fake content is developing all the time and is one reason for the rise in cases. “What’s worse, this content often starts with images taken from the victim’s social media,” she said. “It’s scary. The concern is that our figure of 1.4 per cent is only the tip of the iceberg. It’s based on the number of women reporting abuse to a support service, but we’re worried there are many more who do not feel able to reach out for support.”

The helpline is working with police forces to improve the way they respond to intimate image abuse amid fears that some officers dismiss victims or trivialise their experiences.

Detective Chief Inspector Gary Sergeant of Police Scotland’s domestic abuse co-ordination unit said: “We are experiencing an increase in the number of such incidents being reported. We want victims to know they will be treated with respect and dignity.”