Legislation expected this year

The UK government is set to press technology companies to introduce default nudity-blocking software, as part of its continuing effort to legislate for child safety online.

Ministers want companies like Apple and Google to build nudity-detection algorithms directly into their operating systems, preventing users from taking or sharing explicit images unless they have verified that they are adults.

Under the proposal, adults would need to confirm their age – potentially through biometric checks or official identification – to disable the blocks and create or access such content.

People familiar with the matter told the Financial Times that the measures are expected to be included in a new Home Office strategy to tackle violence against women and girls, set to be unveiled in the coming days.

Officials considered making nudity-blocking mandatory for all devices sold in the UK, but have opted not to pursue that route for now. Instead, the Home Office is expected to urge companies to adopt the measures on a voluntary basis.

While both Apple and Google have developed tools aimed at warning younger users about sensitive content, these protections are limited. Sensitive content warnings can typically be overridden by entering a passcode.

Safeguarding minister Jess Phillips has publicly praised companies that have already developed technology aimed at addressing these risks. She has highlighted HMD Global, which has launched a device designed for children that automatically detects and blocks explicit imagery.

The software used on the device, known as HarmBlock, is produced by UK-based company SafeToNet.

Although the new policy initially centres on smartphones, officials say the same approach could be extended to desktops. They point to existing tools, such as Microsoft Teams’ ability to scan for “inappropriate content”, as evidence that similar systems could work across different devices.

The plans are likely to face resistance from privacy and civil liberties campaigners, as well as scepticism over how effective the technology would be in practice.

Earlier this year, when the UK introduced age checks for pornographic websites under the Online Safety Act, users were easily able to bypass the restrictions using fake photographs or VPNs.

The Online Safety Act requires platforms that host adult content to put age-verification or age-assurance systems in place to prevent access by under-18s. The definition of “platforms” spans a broad range, from social media and search engines to gaming and pornography sites.

UK officials say the latest proposals are designed to sit alongside the Online Safety Act.

The UK is not alone in its attempt to legislate child protection online. In July, the European Commission unveiled a set of guidelines and a prototype for an age-verification app designed to create a safer online space for children; and just this month, Australia implemented its social media ban for under-16s.