Twitter Has Cut the Team That Monitors Child Sexual Abuse Material
Even as Elon Musk has said that removing child sexual exploitation content from Twitter was “Priority #1,” the teams charged with monitoring for and removing such content have been cut considerably since the tech entrepreneur took control of the social media platform. Bloomberg reported last month that fewer than 10 people are now tasked with tracking such content – down from 20 at the start of the year.
Even more worrisome: the Asia-Pacific division now has just one full-time employee responsible for removing child sexual abuse material from Twitter.
“In this digital era, child sex trafficking and exploitation have become much more widespread and difficult to address. Criminals have become savvier about ways to avoid detection through the Internet. It is much easier to exploit children today than even 20 years ago,” warned Dr. Mellissa Withers, associate clinical professor of preventive medicine and director of the master of public health online program at the University of Southern California.
Multiple studies have found that the majority of teens spend at least four hours a day on electronic devices – and social media sites including Twitter, Instagram, and YouTube could provide the perfect opportunity for a predator to identify potential victims with little risk of being caught.
“Victims may never meet their traffickers or abusers in person; they are groomed through social media, chat and gaming platforms,” added Withers.
Catfished
She explained that children and teens can be drawn into producing child sexual abuse material (CSAM) through image- and video-sharing platforms, and they may not even realize that the images they send can be used against them or shared easily with others. In many cases, predators employ “catfishing” techniques, posing as a teen to gain the trust of their potential victims.
Just last month, news circulated of a Virginia sheriff’s deputy who posed as a 17-year-old boy online and asked a teenage California girl for nude photos before driving across the country and killing her mother and grandparents.
Sextortion
In other cases, the abuse takes the form of “sextortion,” in which the predator manipulates the victim over time into sending nude photos.
“This eventually leads to harassment and threats to share the images unless money is sent,” said Withers. “Children are usually the victims of sextortion; one study found that 25% of victims were 13 or younger when they were first threatened and over two-thirds of sextortion victims were girls threatened before the age of 16 (Thorn, 2018).”
Is Twitter Failing Our Children?
Experts say it is deeply concerning that Twitter and other social media platforms are not doing their part to eliminate the CSAM spread through their services. The volume of data that needs to be scrubbed internally is substantial, and a staff of one or two people doing that job is simply insufficient, even with external agencies assisting.
“Having a child safety team for online monitoring is critical for organizations operating on social media,” suggested Dr. Brian Gant, assistant professor of cybersecurity at Maryville University.
“In Twitter’s case most importantly because there is consensual pornography that is shared in large numbers on the platform,” Gant noted. “Not having an internal team to discern what is consensual, and what would be considered innocent images or child exploitation is paramount.”
The failure to act could be seen as enabling the predators to strike.
“Social media platforms are exacerbating child abuse when they allow users to condone pedophilia, exploitation, pornography, and other forms of abuse as well as enhancing the ability for children to be groomed, controlled, and exploited,” added Lois A. Ritter, associate professor for the master of public health program at the University of Nevada, Reno.
The reduction of the child safety team has thus been viewed with alarm.
“Social media platforms have a social and ethical responsibility to monitor the material on their sites to prevent and disrupt such horrific acts and prevent child victimization,” said Ritter. “Having staff monitor posts and follow up on complaints in a timely manner is critical. Unfortunately, profit often trumps child welfare. If this is a permanent staffing change, children will suffer.”
However, even with a large team, it could be impossible to monitor all the content on the platform.
“Automated technological tools can help but these should not take the place of a human moderator who will have to make decisions about what is real or not, or what is child sex abuse or not,” said Withers. “Maybe we need to hold these companies to a higher standard? They could be held responsible for creating an environment which allows for the proliferation of child sex abuse material.”
Of course, such content isn’t just spread on social media. It existed long before the Internet age.
“We should also remember that the United States is one of the largest producers and consumers of child abuse content in the world,” Withers continued. “We need to ask ourselves why and what we can do about reducing the demand for such content.”