Paedophiles are starting to use virtual reality headsets to view child abuse images, data shows
The NSPCC says the figures are "incredibly alarming but reflect just the tip of the iceberg" of what children are experiencing online.
Paedophiles are starting to use virtual reality headsets to view child abuse images, according to police data.
Use of the technology was recorded in eight cases in 2021/22 - the first time it has been specifically mentioned in crime reports.
During that period, police recorded 30,925 offences involving obscene images of children - the highest number logged by forces in England and Wales.
Of these, a social media or gaming site was recorded in 9,888 cases - Snapchat was named 4,293 times, Instagram 1,363, Facebook 1,361 and WhatsApp 547.
The NSPCC, which collated the data, is calling for a number of amendments to the Online Safety Bill to prevent more children from being exposed to abuse.
Sir Peter Wanless, chief executive of the NSPCC, said: "These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online.
"We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.
"By creating a child safety advocate that stands up for children and families the government can ensure the Online Safety Bill systemically prevents abuse."
The NSPCC also wants a change to the law that would mean senior managers of social media sites are held criminally liable if children are exposed to abuse.
Sir Peter said: "It would be inexcusable if in five years' time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media."
A government spokesperson said: "Protecting children is at the heart of the Online Safety Bill and we have included tough, world-leading measures to achieve that aim while ensuring the interests of children and families are represented through the children's commissioner.
"Virtual reality platforms are in scope and will be forced to keep children safe from exploitation and remove vile child abuse content.
"If companies fail to tackle this material effectively, they will face huge fines and could face criminal sanctions against their senior managers."
A spokesman for Meta - which owns Facebook, Instagram and WhatsApp - said: "This horrific content is banned on our apps, and we report instances of child sexual exploitation to NCMEC (the National Center for Missing & Exploited Children).
"We lead the industry in the development and use of technology to prevent and remove this content, and we work with the police, child safety experts and industry partners to tackle this societal issue.
"Our work in this area is never done, and we'll continue to do everything we can to keep this content off our apps."
A Snapchat spokesperson said: "Any sexual abuse of children is abhorrent and illegal. Snap has dedicated teams around the world working closely with the police, experts and industry partners to combat it.
"If we proactively detect or are made aware of any sexual content exploiting minors, we immediately remove it, delete the account, and report the offender to authorities. Snapchat has extra protections in place that make it difficult for younger users to be discovered and contacted by strangers."
Source: Sky News