AI driving 'explosion' of fake nudes as victims say the law is failing them
Campaigners are warning that the use of artificial intelligence (AI) to create realistic but fake nude images of real women is becoming "normalised".
It's also an increasing concern in schools. A recent survey by Internet Matters found 13% of teenagers have had an experience with nude deepfakes, while the NSPCC told Sky News "a new harm is developing".
Ofcom will later this month introduce codes of practice for internet companies to clamp down on the illegal distribution of fake nudes, but Sky News has met two victims of this relatively new trend, who say the law needs to go further.
Earlier this year, social media influencer and former Love Island contestant Cally Jane Beech, 33, was horrified to discover that someone had used AI to turn an underwear brand photograph of her into a nude image, which was being shared online.
The original image had been uploaded to a site that uses software to digitally transform a clothed picture into a naked picture.
She told Sky News: "It looked so realistic, like nobody but me would know. It was like looking at me, but also not me."
She added: "There shouldn't be such a thing. It's not a colouring book. It's not a bit of fun. It's people's identity and stripping their clothes off."
In November, Assistant Chief Constable Samantha Miller, of the National Police Chiefs' Council, addressed a committee of MPs on the issue and concluded "the system is failing", with a lack of capacity and inconsistency of practice across forces.
ACC Miller told the women and equalities committee she'd recently spoken to a campaigner who was in contact with 450 victims and "only two of them had a positive experience of policing".
The government says new legislation outlawing the generation of AI nudes is coming next year, although it is already illegal to make fake nudes of minors.
Meanwhile, the problem is growing with multiple apps available for the purpose of unclothing people in photographs. Anyone can become a victim, although it is nearly always women.
Professor Clare McGlynn, an expert in online harms, said: "We've seen an exponential rise in the use of sexually explicit deepfakes. For example, one of the largest, most notorious websites dedicated to this abuse receives about 14 million hits a month.
"These nudify apps are easy to get from the app store, they're advertised on TikTok. So, of course, young people are downloading them and using them. We've normalised the use of these nudify apps."
'Betrayed by my best friend'
Sky News spoke to "Jodie" (not her real name) from Cambridge who was tipped off by an anonymous email that she appeared to be in sex videos on a pornographic website.
"The images that I posted on Instagram and Facebook, which were fully clothed, were manipulated and turned into sexually explicit material," she said.
Jodie began to suspect someone she knew was posting pictures and encouraging people online to manipulate them.
Then she found a particular photograph, taken outside King's College in Cambridge, that only one person had.
It was her best friend, Alex Woolf. She had airdropped the picture to him alone.
Woolf, who once won BBC Young Composer of the Year, was later convicted of offences against 15 women, largely thanks to Jodie's perseverance and detective work.
Even then, his conviction related only to the offensive comments attached to the images, because while it is illegal to share such images, it is not a crime to ask others to create them.
He was given a suspended sentence and ordered to pay £100 to each of his victims.
Jodie believes it's imperative new laws are introduced to outlaw making and soliciting these types of images.
"My abuse is not your fun," she said.
"Online abuse has the same effect psychologically that physical abuse does. I became suicidal, I wasn't able to trust those closest to me because I had been betrayed by my best friend. And the effect of that on a person is monumental."
'A scary, lonely place'
A survey in October by Teacher Tapp found 7% of teachers answered yes to the question: "In the last 12 months, have you had an incident of a student using technology to create a fake sexually graphic image of a classmate?"
In their campaigning, both Cally and Jodie have come across examples of schoolgirls being deepfaked.
Cally said: "It is used as a form of bullying because they think it's funny. But it can have such a mental toll, and it must be a very scary and lonely place for a young girl to be dealing with that."
The NSPCC said it has had calls about nude deepfakes to its helpline.
The charity's policy manager for child safety online, Rani Govender, said the pictures can be used as "part of a grooming process" or as a form of blackmail, as well as being passed around by classmates "as a form of bullying and harassment".
"Children become scared, isolated and they worry they won't be believed that the images are created by someone else," Ms Govender said.
She added: "This is a new harm, and it is developing, and it will require new measures from the government with child protection as a priority."
Alex Davies-Jones, under-secretary of state for victims, told MPs in November: "We've committed to making an offence of creating a deepfake illegal and we will be legislating for that this session."
For campaigners like Jodie and Cally, the new laws cannot come soon enough. However, they worry the legislation won't have strong enough clauses banning the soliciting of content and ensuring images are removed once they've been discovered.
-SKY NEWS