19 December 2025
Reading time: 3 min

UK Moves to Prohibit Deepfake Nudification Applications

The UK government has announced its intention to outlaw applications that create so-called “nudification” images, as part of its broader campaign to combat online misogyny.

New legislation unveiled on Thursday, part of the government’s pledge to halve violence against women and girls, will criminalize the development and distribution of AI tools that let users alter images to make it appear as though individuals are undressed.

This forthcoming law will build upon existing regulations concerning sexually explicit deepfake content and the misuse of intimate images, according to government officials.

“Women and girls should feel secure both online and offline,” remarked Technology Secretary Liz Kendall.

She emphasized, “We will not remain passive while technology is exploited to abuse, demean, and take advantage of them through the creation of non-consensual sexual deepfakes.”

Under the existing Online Safety Act, it is already illegal to create explicit deepfake images of someone without their consent.

Kendall further explained that the new legislation would render it unlawful to create or distribute nudification applications, ensuring that “those who profit from these tools or facilitate their usage will face severe legal consequences.”

Nudification applications employ generative AI to convincingly simulate the removal of clothing from individuals in photos or videos.

Experts have raised alarms regarding the proliferation of such tools, highlighting their potential to cause significant harm to victims, particularly when utilized to produce child sexual abuse material (CSAM).

In April, Dame Rachel de Souza, the Children’s Commissioner for England, called for an outright ban on nudification applications.

“The act of producing such images is rightly illegal; the technology that enables it should be as well,” she stated in a report.

The government confirmed on Thursday its commitment to collaborate with technology firms to devise strategies for addressing intimate image exploitation.

This initiative will continue its partnership with SafeToNet, a UK safety technology company.

SafeToNet has developed AI tools that, according to the company, can identify and block sexual content and deactivate cameras when they detect such material being captured.

This technology builds on existing filters used by platforms like Meta to identify and flag potential nudity in images, primarily to prevent children from capturing or sharing explicit images of themselves.

The proposed ban on nudification applications follows earlier appeals from child protection organizations urging the government to take action against these technologies.

The Internet Watch Foundation (IWF), which operates the Report Remove helpline that allows minors to confidentially report explicit images of themselves online, revealed that in 19% of confirmed reports, some or all of the images had been altered.

Kerry Smith, the IWF’s chief executive, expressed support for the new measures.

“We are pleased to see tangible steps towards banning these so-called nudification applications, which should not exist as products,” she said.

“Such applications pose significant risks to real children, and we observe that the imagery produced is often exploited in some of the darkest areas of the internet.”

While the NSPCC welcomed the announcement, its director of strategy, Dr. Maria Neophytou, expressed disappointment over the absence of similar ambitions to enforce mandatory protective measures at the device level.

The charity is among several organizations advocating for the government to require tech companies to implement more effective ways of identifying and preventing the dissemination of CSAM on their platforms, including private messaging systems.

The government stated it aims to make it “impossible” for children to capture, share, or access nude images on their devices.

Furthermore, it is pursuing legislation to prohibit AI tools intended for the creation or dissemination of CSAM.
