Child abuse material found on VR headsets, police data shows
Paedophiles are using VR headsets to view and store child abuse imagery, crime figures show for the first time.
The NSPCC obtained the data through a Freedom of Information request to all 45 UK police forces about the number of child abuse image offences recorded.
It found forces had recorded eight offences involving headsets and VR.
The charity is warning that the growing use of virtual reality headsets to explore the so-called Metaverse exposes children to new risks online.
Sir Peter Wanless, the NSPCC’s chief executive, said: “We hear from young people who feel powerless and let down as sexual abuse risks becoming normalised.”
VR headsets allow access to a variety of virtual games, chat rooms and experiences, sometimes known as the “Metaverse”.
Mark Zuckerberg rebranded Facebook's parent company as Meta, embracing the idea of the Metaverse. He believes VR is an important part of the company's future and has invested billions in the technology.
The UK government expects spending on virtual and augmented reality technology to reach more than £60bn by 2030.
Catherine Allen, a VR expert and chief executive of the immersive technology company Limina Immersive, said: “This is an emerging, fast growing threat that politicians and technology companies need to take seriously.
“Online offenders will flock to places where there is little scrutiny or regulation and we can see this happening in VR.”
The government says that VR headsets and the Metaverse are covered by the Online Safety Bill, which is currently passing through the House of Lords.
A spokesman from the Department for Digital, Culture, Media and Sport said if platforms failed to protect children, “companies will face huge fines and could face criminal sanctions against senior managers”.
The figures involving VR are, however, small compared to the overall picture.
They showed a record 30,925 offences involving the possession and sharing of indecent images of children were recorded in 2021/22.
The NSPCC warned that “unregulated social media is fuelling the unprecedented scale” of the problem.
Sir Peter said the figures were “incredibly alarming”, but “reflect just the tip of the iceberg of what children are experiencing online”.
Social media or gaming sites were named in 9,888 offences.
Snapchat was named in 4,293 offences, Instagram in 1,363, Facebook in 1,361, and messaging platform WhatsApp in 547.
Since their inception, VR and augmented reality have been used legally in the world of commercial adult sex work.
It has been argued that it would only be a matter of time before the same technology was used to groom and sexually exploit children, as well as to share illegal content.
The BBC first found in 2017 that VR headsets were being used to sexually exploit children.
In this instance, a man based in Egypt was advertising the sale of child abuse images and videos online.
He offered that material in VR, with a price tag of $160 (£132). He claimed the footage was shot using a 360-degree camera, and offered “technical support”.
In 2022, the BBC reported that a Metaverse app allowed children to enter strip clubs.
The NSPCC is calling on the government to create a statutory child safety advocate through the Online Safety Bill.
A spokesman for the Department for Digital, Culture, Media and Sport said the Bill included “tough, world leading measures” to protect children.
In a statement, Snapchat said: “Snap has dedicated teams around the world working closely with police, experts and industry partners.”
It added that if sexual content exploiting children is discovered, “we immediately remove it, delete the account and report the offender to the authorities”.
Meta, which owns Facebook, Instagram, WhatsApp and the Meta Quest headset, said: “This horrific content is banned on our apps.”
“We lead the industry in the development and use of technology to prevent and remove this content,” a spokesperson added.