Deepfake Abuse: Now Illegal
This week marks a significant moment in the fight against image-based abuse.
Following months of campaigning by survivors, advocates and organisations including the End Violence Against Women Coalition, Not Your Porn, Glamour UK, Professor Clare McGlynn and survivor-campaigner Jodie, a new law has come into force criminalising the creation of non-consensual intimate images - including AI-generated deepfakes.
For survivors, this is a welcome and long-overdue step.
The law now makes it a criminal offence to create or request the creation of intimate images of an adult without their consent, including sexually explicit deepfake images. Sharing or threatening to share intimate images without consent - already illegal under existing laws - is also clearly reaffirmed. Protections for children remain absolute.
This matters. Deepfake abuse is not hypothetical. It is already causing real harm, stripping people of control over their bodies and identities, and weaponising technology to intimidate, silence and shame.
But criminalisation alone is not enough.
Why this can’t stop here
As the EVAW Coalition has made clear, laws that respond after harm has occurred will always fall short if prevention is not taken seriously. Survivors should not have to wait for their lives to be disrupted, their safety compromised, or their images spread before systems act.
Image-based abuse thrives in environments where:
Platforms are slow to act or profit from engagement
Survivors face barriers to takedowns and accountability
Education fails to reflect the realities of digital life
Specialist support services are underfunded and overstretched
This is why campaigners are calling for more than just criminal offences.
What’s still needed
Campaigners are urging the government to go further by committing to:
Civil routes to justice, including fast and enforceable takedown orders requiring images to be removed from platforms and perpetrators’ devices
Meaningful regulation of tech companies that host, enable or profit from image-based abuse
Relationships and sex education that reflects young people’s lived realities in the digital age
Sustained funding for specialist services that provide life-saving advocacy and survivor-led support
Earlier this week, a 73,000-strong petition calling for these changes was delivered to Downing Street.
Why this matters to Make Yourself Heard
At MYH, we recognise image-based abuse - including deepfake abuse - as part of a wider landscape of gendered harm, coercion and silencing. It intersects with sexual violence, domestic abuse, online harassment and systemic failures to protect victims.
Progress happens when survivors are listened to, believed, and backed by collective action. This new law exists because people refused to accept the status quo - and because survivors spoke out, even when the cost was high.
But progress is not the same as justice.
Justice means prevention. Accountability. Support. And systems that respond at the speed of harm - not years later.
Get involved
If you’ve been affected by image-based abuse or you’re working to challenge it, we want to hear from you. MYH x Noticeboard exists to share resources, campaigns and survivor-led calls for change.
Because silence protects perpetrators.
And collective voices are harder to ignore.
Source: Glamour UK
As of 6 February 2026, it is a specific criminal offence to create, or request the creation of, intimate images of an adult - including intimate AI-generated 'deepfake' images - without consent or a reasonable belief in consent.
Source: Crown Prosecution Service