Technology Shouldn’t Enable Harm
AI-generated nudification is image-based sexual abuse
A BBC investigation exposes something many women already know too well: technology is being used to replicate harm, faster and at scale.
A woman described feeling “dehumanised and reduced to a sexual stereotype” after Grok, the AI tool linked to Elon Musk and the platform X, was used to digitally remove her clothes without consent. Not because she shared anything sexual. Not because she agreed. But because the technology allowed it - and the platform failed to stop it.
This is not “innovation gone wrong”. It is image-based sexual abuse, enabled by AI.
Despite policies that claim to prohibit this behaviour, non-consensual sexualised images continue to be created and circulated. As legal expert Clare McGlynn points out, platforms could prevent this - if they chose to. The lack of meaningful enforcement tells its own story.
Regulators say action is coming. The Home Office has proposed criminalising nudification tools. Ofcom says platforms must reduce the risk of illegal content. But for those already affected, harm isn’t theoretical. It’s personal, violating, and ongoing.
This is another layer in the wider pattern of abuse against women and girls - where consent is ignored, accountability is delayed, and responsibility is endlessly deflected.
If this has happened to you, you are not overreacting. This is abuse. Report it.
Report the image or content directly on the platform where it appears.
In the UK, non-consensual sexual deepfakes are illegal. You can report to the police by calling 101 or online.
Ofcom accepts reports where platforms fail to act on illegal content.
Get support
Revenge Porn Helpline (UK) - specialist support for image-based abuse
revengepornhelpline.org.uk
Rape Crisis England & Wales - emotional support and advocacy
rapecrisis.org.uk
Victim Support - free, confidential help
victimsupport.org.uk