Highlights
- Government plans to ban AI tools that digitally remove clothing from images
- New offences target the creation and supply of nudification apps
- Measures form part of a wider strategy to cut violence against women and girls
Ban targets AI-powered image abuse
The UK government says it will ban so-called “nudification” apps, describing them as tools that fuel misogyny and online abuse. The announcement was made on Thursday as part of a broader plan to halve violence against women and girls.
Under the proposed laws, it will become illegal to create or supply artificial intelligence tools that allow users to edit images to make it appear as though a person’s clothing has been removed. The government says the offences will strengthen existing rules on sexually explicit deepfakes and intimate image abuse.
Technology Secretary Liz Kendall says women and girls deserve to be safe online as well as offline, warning that technology must not be used to abuse, humiliate or exploit people through non-consensual explicit imagery.
Building on existing online safety laws
Creating explicit deepfake images of someone without their consent is already a criminal offence under the Online Safety Act. The government says the new measures go further by targeting the technology itself.
Ms Kendall says the new offence will ensure that those who profit from nudification apps, or enable their use, face legal consequences.
Nudification, sometimes described as “de-clothing”, uses generative AI to create realistic images or videos that falsely show a person naked. Experts warn the technology can cause serious harm, particularly when used to generate child sexual abuse material.
Pressure from child safety campaigners
In April, the Children’s Commissioner for England, Dame Rachel de Souza, called for a total ban on nudification apps, arguing that while creating such images is already illegal, the technology that enables them should be banned as well.
Child protection charities have long raised concerns about the scale of manipulated imagery online. The Internet Watch Foundation says 19% of young people who use its Report Remove service say that some or all of their images have been altered.
The foundation’s chief executive, Kerry Smith, welcomes the move, saying nudification apps have “no reason to exist” and place children at greater risk, with images often ending up in harmful online spaces.
Working with tech firms on prevention
The government says it will work with technology companies to develop ways to tackle intimate image abuse. This includes ongoing collaboration with UK safety technology firm SafeToNet, which has developed AI tools designed to detect and block sexual content.
Similar systems are already used by platforms such as Meta to flag potential nudity, often to prevent children from creating or sharing intimate images.
While charities including the NSPCC welcome the ban, some say the proposals do not go far enough. The NSPCC says it is disappointed not to see stronger commitments to mandatory protections built directly into devices, and it continues to call for tougher action to prevent the spread of child sexual abuse material, including in private messages.
The government says its aim is to make it impossible for children to take, share or view nude images on their phones.