Instagram facilitates ‘vast pedophile network’ on its platform, bombshell report finds
(LifeSiteNews) — Instagram is facilitating large pedophile and child-trafficking networks, according to a bombshell report by The Wall Street Journal (WSJ) published June 7.
Researchers from Stanford University and the University of Massachusetts Amherst together with the WSJ investigated pedophile activities on Instagram and found that “Instagram doesn’t merely host these activities. Its algorithms promote them.”
“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found,” the WSJ report states.
Instagram allows users to search for hashtags used by pedophiles to connect with accounts that advertise and sell child-sex material, including images, videos, and in-person “meetings.”
Alex Stamos, who served as chief security officer of Meta (Instagram’s parent company) until 2018 and is now the director of the Stanford Internet Observatory (SIO), noted, “That a team of three academics with limited access could find such a huge network should set off alarms at Meta.”
“I hope the company reinvests in human investigators,” Stamos added.
According to the WSJ, the researchers set up test accounts that, after viewing a single account from a pedophile network, instantly received recommendations from buyers and sellers of “child-sex-content.”
“Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children,” the report stated.
The SIO researchers found 405 sellers of what they described as “self-generated child-sex material,” meaning accounts that are allegedly run by the children themselves, some claiming to be as young as 12.
The WSJ wrote that many of these accounts “show users with cutting scars on the inside of their arms or thighs, and a number of them cite past sexual abuse.”
“According to data gathered via Maltego, a network mapping software, 112 of those seller accounts collectively had 22,000 unique followers,” the WSJ reported.
In addition to the accounts that actively buy and sell child-sex content, many users in the “pedophile community” share pro-pedophilia memes or discuss ways of gaining access to minors.
“Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions,” the WSJ wrote.
Meta has repeatedly stressed that it works to remove these users. A spokesperson claimed that the company removed almost 500,000 such accounts in January alone for violating its child safety policies.
The Stanford researchers found that Meta has struggled more than other social media platforms to remove child-sex content, “both because of weak enforcement and design features that promote content discovery of legal as well as illicit material.”
Instagram’s algorithms that determine what content is shown to users work in such a way that “[e]ven glancing contact with an account in Instagram’s pedophile community can trigger the platform to begin recommending that users join it.”
These algorithms have also allowed users to search for pedophile terms, even though the platform recognizes that these search terms might be associated with illegal material.
“In such cases, a pop-up screen for users warned that ‘These results may contain images of child sexual abuse,’ and noted that production and consumption of such material causes ‘extreme harm’ to children,” the WSJ reported.
The pop-up window nonetheless allowed the user to view the search results by offering two options: “Get resources” and “See results anyway.”
The WSJ report noted that Meta has suppressed such account networks before, like those connected with the alleged “insurrection” after the events of January 6, 2021. It should therefore not be too difficult for the social media giant to suppress pedophilic content if Meta were as invested in child protection as in censoring controversial political opinions.
A Meta spokesperson said that systems to prevent the recommendation of child-sex content are currently being developed.
Brian Levine, head of the UMass Rescue Lab, a research organization that aims to prevent the online victimization of children, called Instagram’s role in promoting pedophile accounts and content “unacceptable.”
“Pull the emergency brake,” Levine urged Meta. “Are the economic benefits worth the harms to these children?”