“Only people on our CSE team have seen those pictures,” Musk tweeted, referring to the company’s child sexual exploitation staff. “For now, we will delete those posts and reinstate the account.”
In fact, the image in question had drawn more than 3 million views and 18,000 retweets, according to Twitter statistics on a cached version of the tweet from Tuesday.
Experts said that even photos that are partially obscured typically qualify as illegal child sexual abuse material, or CSAM.
“Generally speaking, even if it is redacted, if it’s clear it’s a child, it’s still CSAM,” said Gavin Portnoy, vice president of the National Center for Missing and Exploited Children, the congressionally chartered nonprofit that is the nation’s clearinghouse for battling child victimization.
Musk and Twitter did not respond to requests for comment.
The person whose Twitter account posted the image, Dominick McGee, said he’d done so to draw attention to child sex trafficking. The image was taken from one of the most notorious child abuse videos in the world, made by Peter Gerard Scully, an Australian sentenced in 2022 to life in prison plus 129 years for rape, human trafficking and the sexual abuse of children as young as 18 months.
McGee is popular among adherents of the QAnon web of baseless conspiracy theories, including the belief that child predators include or influence high-level government officials, especially Democrats. Musk reinstated banned QAnon accounts after buying Twitter in October. He is currently rebranding the company as X and hopes to add commerce and payment services to the site’s discussion and messaging functions.
Asked if he regretted sharing the image, McGee said he believed he was suspended for posting about former president Barack Obama. “That excuse was a scapegoat,” he said by direct message.
Twitter’s “zero-tolerance” policy, last updated in 2020, says that “viewing, sharing, or linking to child sexual exploitation material contributes to the re-victimization of the depicted children” and is one of the platform’s “most serious violations.”
The policy says that the consequence for violation “in the majority of cases … is immediate and permanent suspension.” It also says that, in a “limited number of situations, where we haven’t identified any malicious intent,” the content will be removed and the user will be temporarily locked out of their account.
Musk has said that stopping the spread of CSAM on Twitter is one of his highest priorities, and has suggested that the company’s prior management was too busy censoring conservatives to root out exploiters.
Yet he has slashed trust and safety staffing, and even die-hard Musk fans have said problems continue. A self-styled victims’ advocate who uses the name Eliza Bleu, and who last year hosted Musk on a podcast where she described him as leading the industry in fighting CSAM, tweeted in June that some videos of child exploitation have remained on the site for more than a month after being reported.
Last month, the Stanford Cyber Policy Center reported that Twitter had been letting through known CSAM that should have been caught with PhotoDNA, a hash-matching tool that identifies previously detected images so internet companies can block them.
“It appeared that PhotoDNA, at least for some portion of material, was completely off and no one noticed it. It lasted for weeks and let tons of known CSAM through,” said David Thiel, chief technologist at the Stanford Internet Observatory.
“It seems like they’ve been stumbling quite a bit, between detection mechanisms breaking and not having a content enforcement team that can rapidly address this particular issue,” he told The Post Thursday.
In cases not involving child abuse, Musk’s Twitter has been quick to suspend or ban accounts. The company has suspended the accounts of a college student who tracked his private jet, journalists who reported on those suspensions and the founder of an online court-filing database who was critical of Musk.
In contrast, some prominent far-right accounts on Twitter were recently rewarded with direct payouts said to represent shares of advertising revenue. Musk has also reinstated accounts that were suspended before his takeover, including former president Donald Trump, who has yet to tweet, and Andrew Anglin, the founder of a neo-Nazi website who has since been re-suspended.
The controversy could undermine Twitter’s ability to win back advertisers who have fled the social network due to concerns over Musk’s leadership and content moderation views. Earlier this month, Musk tweeted that the company’s advertising revenue had suffered a “50% drop.”
McGee disputed that he had shared a child sexual exploitation image and said Musk was “weird for allowing these Democrats to push a narrative that I shared child porn.”
Musk’s tweets helped bring attention to McGee, who tweets under the handle Dom Lucre and is known for sharing news items and conspiracy theories about child sex trafficking. Many of his past tweets have alluded to QAnon or Pizzagate, a conspiracy theory that alleged top Democrats were running a child sex-trafficking ring out of a Northwest Washington pizza parlor.
His Twitter account received more than 27,000 new followers on Thursday, compared to about 1,000 last Thursday, according to data from the social media analytics firm Social Blade.
Yoel Roth, the company’s former head of trust and safety who resigned in November after Musk’s takeover, posted Wednesday on the Twitter competitor Bluesky that “it’s insane to write ‘we have zero tolerance for child sexual exploitation’ while also arbitrarily reinstating accounts that share” child sexual abuse material.
Roth fled his home late last year after Musk, in tweets to his more than 100 million followers, suggested Roth had encouraged children to access adult material online, a misrepresentation of Roth’s graduate-school writing that exposed him to online harassment and death threats.
“This guy blew up my life by saying I condone pedophilia, and then he turns around and does this,” Roth said on Bluesky.
McGee also shared the same child torture post on Instagram. The Instagram post, which had roughly 600 likes, was deleted Thursday. A spokeswoman for Meta, which owns Facebook and Instagram, said the image violated its policies against child sexual exploitation.