Snapchat flagged in nearly half of child abuse imagery crimes in past year

Snapchat was flagged in nearly half of the crimes involving child abuse imagery over the past year, new figures reveal.

Freedom of information requests submitted by the NSPCC children's charity to 35 police forces showed Facebook, Instagram and WhatsApp were named in around a quarter of child abuse imagery offences in which police identified an online platform.

The figures also show the number of child abuse image crimes recorded by UK police forces increased by 25% in a year, with a total of 160,000 offences recorded since 2017.

The Online Safety Act, which was passed into law last year, aims to make social media companies more responsible for the content published on their platforms.

Ofcom, the regulator, is drawing up guidelines for how the laws will be enforced, but there are concerns delays in enforcement could mean it takes years before the measures are implemented.

A 14-year-old girl who was tricked by an adult into sending nude images told the NSPCC's ChildLine counselling service: "One night I got chatting with this guy online who I'd never met and he made me feel so good about myself.

"He told me he was 15, even though deep down I didn't believe him. I sent him a couple of semi-nudes on Snap[chat], but then instantly regretted it."

The Internet Watch Foundation (IWF) said it investigated a record 392,660 reports of suspected child abuse imagery last year - 5% more than in 2022.

Around one in five (54,250) of the websites found to contain child abuse imagery included the most severe form, known as Category A.

'A truly disturbing picture'

Susie Hargreaves, chief executive of the IWF, said: "This is a truly disturbing picture, and a reflection of the growing scale of the availability, and demand, for images and videos of children suffering sexual abuse.

"That more and more people are trying to share and spread this material shows we should all be doing everything we can to stop this, building more, and innovative solutions to keep children safe.

"The IWF is ready to support technology companies and Ofcom in implementing the Online Safety Act to help make the UK the safest place in the world to be online."

In a statement, a Snapchat spokesperson said: "Child sexual abuse is horrific and has no place on Snapchat.

"We use cutting-edge detection technology to find and remove this type of content, and work with police to support their investigations.

"Snapchat also has extra safety features for 13 to 17-year-olds, including pop-up warnings if they're contacted by someone they don't know."

Source: Sky News