If Child Protection Is Musk’s ‘Top Priority,’ Why Is It So Easy To Find Users Selling Images Of Nude Minors On Twitter?

Published 6 months ago

Elon Musk has received praise for his attempts to tackle Twitter’s long-documented child exploitation problems. Yet Forbes found many users who appear to be trading egregious illegal content.

Last week, influential podcaster Liz Wheeler, who describes herself as “unapologetically one of the conservative movement’s boldest voices,” took to Twitter to praise its new CEO, Elon Musk, for purging the site of “child pornography and child trafficking hashtags.” (Child advocates have long argued these terms indicate legitimacy and compliance on the part of the victim, and that such content should be called “child sexual abuse material.”)

“Think about how many little children he’s saving from sexual abuse, exploitation & torture,” she added. Responding to Wheeler’s comments on Tuesday, Musk tweeted, “This will forever be our top priority.”

Yet such illegal material remains alarmingly easy to find on Twitter—in multiple languages. Hashtags and search terms long related to child sexual abuse material (CSAM) still return significant amounts of associated content. And recently departed Twitter employees, from the rank-and-file to the executive level, told Forbes the issue will be far more difficult to combat now that Musk has obliterated the teams charged with policing it, as well as their institutional knowledge.

As one such team member explained: “No one is left who is an expert on these issues and can do the work. . . . It’s a ghost town.”


While it is true that some hashtags previously used by those trading CSAM appear to have been rendered useless—something for which Musk was recently celebrated—plenty have not.

In under an hour of searching on Tuesday, Forbes found dozens of users offering to buy and sell child sexual abuse images and videos. One offered footage of nude children ages 5 to 17. Another mentioned an exchange of images of children being abused on Telegram. Both these tweets, as well as nearly 30 others either promoting or seeking CSAM, were posted in the last week.

Simply searching the words used in banned hashtags revealed advertisements for much of the same content. One Twitter search for links to a well-known online storage platform returned a number of tweets offering videos of young boys and girls having sex with fathers and “with adult,” as well as images of “forced sex.”

Other questionable hashtags were still searchable. One containing an abbreviation long associated with CSAM returned ads, including one featuring the image of what appeared to be an underage girl alongside a QR code that could be scanned and promised to take the user through to a CSAM-themed website. Forbes also found multiple Spanish-language users offering what appeared to be images of underage girls over on Telegram, with Spanish hashtags attached.

Such posts often had other hashtags attached. These could be clicked on to reveal yet more problematic content, such as a tweet with a cartoon depiction of a minor and male genitalia, with overlaid text asking for “anything illegal or legal. This person is horny.”

Even some hashtags that were ostensibly banned before Musk took over Twitter returned a handful of problematic results. While no new posts were visible when searching one such hashtag, one popped up from June in which a user said they were looking for images of newborns. That user was still on Twitter as of Tuesday. With only three posts, all carrying the same text and image, it’s possible they were a bot, another bugbear Musk has promised to tackle. It’s unclear why the account remains online. Twitter did not respond to a request for comment.


Experts who have long studied the pervasiveness of this material on social media have warned that reducing Twitter’s CSAM problem to a handful of hashtags is a gross oversimplification. They say Musk is essentially being applauded for something he hasn’t done, or even tried to do.

“I don’t see any meaningful action taken by Musk so far,” said Carolina Christofoletti, a CSAM threat analyst for TRM Labs and researcher at the University of São Paulo in Brazil. She has cautioned that the issue “is far bigger than a bunch of hashtags” and “much bigger than ‘easy to catch’ fishes.” She noted that the wiping of some troubling CSAM hashtags and other associated actions “were all things done under the previous leadership.”

“He has drained the child safety team on Twitter without any risk impact whatsoever,” she added, pointing to a Musk tweet asking users—rather than his internal team—for ideas on how to better tackle CSAM. “Twitter is exactly how it was, nothing new under the sun here,” she wrote in a LinkedIn post. “The CSAM networks are still there, as active as before.”

Meanwhile, the National Center on Sexual Exploitation is warning that Musk’s reported plans to move forward with a paywalled video feature, akin to OnlyFans, would only exacerbate the already urgent children’s safety issues on the site. “Musk cannot confront child sexual abuse material on Twitter while creating another means that would only fuel sexual abuse and exploitation on the platform,” Lina Nealon, director of corporate and strategic initiatives, said in a public statement Tuesday.

At least one person who remains at the company said the leadership shakeup has not changed Twitter’s policy on, or employees’ commitment to, the issue. A current child safety specialist said Musk is “fully aware” of Twitter’s “zero tolerance” approach to child exploitation material and that “our team is working 24/7” across three regions—San Francisco, Singapore and Dublin—to keep children safe.

“I’m proud of the work we do for our team,” she wrote to Forbes over LinkedIn. “And despite all the outside/internal noise, our team is focused on our work.”

By Thomas Brewster and Alexandra S. Levine, Forbes Staff