Deepfake porn abuse: ‘we must combat the misogynistic culture which allows this to thrive’, BBC presenter says

The government has announced that it is going to create a stand-alone offence for creating a sexually explicit deepfake without permission.

“Unfortunately with work, I’ve seen a lot of people requesting deepfakes of their family members,” says Jess Davies, pausing briefly to let that sink in.

This is not using a face-swapping app to make a funny video of mums or sisters, but creating hardcore porn of family members for twisted sexual gratification or blackmail. And despite the news stories about faked AI images and videos of Donald Trump or Tom Cruise, 97% of all deepfake content online is of a sexual nature.

There are forums and services that produce custom-made deepfake porn of normal people, as well as celebrities. These websites get millions of views a month, and creators are making big sums of money. Most are unrepentant.

Davies continues: “I’ve seen men request deepfakes of their mums, teenage boys requesting deepfakes of their teachers. I’ve seen someone request a deepfake of their sister and specifically say they’re going to use it to blackmail them.” I feel a chill going through to my bones.

Davies used to be a model, and had her own images stolen and abused. She’s now a broadcaster and campaigner, and investigated the epidemic of deepfake abuse for the BBC. She’s interviewed a US state senator who was blackmailed with AI porn and a woman who found out her colleague had created deepfakes of her.

She tells me: “In the work that I have done, the creation of this content really has damaging effects on the victims. Just being able to create that [deepfake] hangs over as a threat, you may distribute it at any time. These women may be thinking, is this something that is going to happen to me, is my digital footprint going to be altered forever.”

Broadcaster and campaigner Jess Davies. Credit: Jess Davies/Kim Mogg/Adobe

I’m speaking to her after the government has announced that it is going to create a stand-alone offence for creating a sexually explicit deepfake without permission. Those who generate these images will face a criminal record and an unlimited fine, and if the AI content is shared more widely offenders could be sent to prison.

“I think it’s a great step towards trying to combat this, and combat the misogynistic culture which allows this to thrive which is what we really need to look at,” Davies tells me. She’s concerned, not just about deepfake crimes, but offenders moving their abuse from online into the real world. Police have found evidence that virtual crimes can embolden sexual offenders to target real people.

“For me, it’s looking at what that actually does to the perpetrator that’s creating that,” she explains. “We know that it’s a lot of teenage boys who are accessing this technology and these sites. If you have access to create porn, effectively, of your family members, of your friends, your colleagues, your teachers. 

“What is that doing to the mental psyche of young men and boys who feel an entitlement to women’s bodies? How does that cross over into real life consequences and how they view their entitlement to women’s bodies and consent. So for me, it’s really important that the creation is made an offence in the UK and across the world.”

Jess Davies outside Parliament. Credit: Jess Davies

Worryingly, it appears a lot of young boys and men don’t understand the impact this has on women. They see it as a victimless crime. “A lot of the time when I’ve posted about this, I’ve received a lot of negative comments from men and boys who are very blasé and say it’s not real, women are overreacting,” Davies tells me.

“A lot of men are putting their own sexual gratification above women’s entitlement to their own bodies, and they don’t think consent is required. That’s a dangerous reflection of the real world, where a lot of men don’t think that consent is required.”

Former Love Island contestant Cally Jane Beech is one of hundreds of celebrities who have become victims of sexually explicit deepfakes. She told a Parliamentary roundtable: “My privacy, dignity, and identity were compromised by the malicious use of artificial intelligence. I became a victim of AI deepfakes. A mere photograph, innocently taken, had been distorted.

“My underwear was stripped away, and a nude body was imposed in its place and uploaded onto the internet without my consent. The likeness of this image was so realistic that anyone with fresh eyes would assume this AI generated image was real when it wasn’t, nevertheless I still felt extremely violated.”

Cally Jane Beech. Credit: Getty

When investigating the issue, Channel 4 presenter Cathy Newman found sexually explicit deepfakes of her had been created. She wrote in the Times: “The nature of my job means I’ve become accustomed to watching disturbing footage and, having been repeatedly trolled online, I consider myself pretty resilient. I therefore thought that I would be relatively untroubled by coming face to face with my own deepfake. The truth was somewhat different.”

“Most of the ‘film’ was too explicit to show on television,” she said. “I wanted to look away but I forced myself to watch every second of it. And the longer I watched, the more disturbed I became. I felt utterly dehumanised.

“That an unknown perpetrator has used readily available technology to fantasise about forcing me into an array of sexual activities can only be described as a violation. Since viewing the video last month I have found my mind repeatedly returning to it. It’s haunting, not least because whoever has abused me in this way is out of reach, faceless and therefore far beyond the limited sanctions presently available.”

Davies is also worried about how practical it will be to track down perpetrators. Firstly, she’s concerned that there’s a potential loophole in the law, as the offender has to have intended to “cause alarm, distress or humiliation” when making the content.

“We know from other offences like this, for example the cyber-flashing offence with its clause of having to prove intent, that it can be quite difficult to prove in a criminal court,” Davies tells me.

To get to court, police have to track down the faceless perpetrators hiding behind a screen. That’s the most difficult part, and Davies doesn’t think officers have the training or the resources for this.

“Unfortunately I’ve had a lot of women contact me about their private real images being shared without consent,” she says. “They had a negative response, because the police said they didn’t have the technology to track these people down. That’s a big concern. If they can’t deal with real-life images, how are they going to deal with deepfakes?

“The offence is a welcome step, but now it’s time for the government to look at how they can help the police to tackle this.” She sighs: “There’s still a long way to go.”

You can watch Jess Davies’ documentary Deepfake Porn: Could You Be Next? on BBC iPlayer.

Ralph Blackburn is NationalWorld’s politics editor based in Westminster, where he gets special access to Parliament, MPs and government briefings. If you liked this article you can follow Ralph on X (Twitter) here and sign up to his free weekly newsletter Politics Uncovered, which brings you the latest analysis and gossip from Westminster every Sunday morning.