Internal Docs Reveal Twitter’s Porn Site Plans Were Derailed by Inability to Police Child Sexual Abuse Material

Twitter CEO Parag Agrawal
Kevin Dietsch/Getty Images

Twitter reportedly considered allowing porn stars to monetize their adult content via an OnlyFans-style subscription feature, but the plans were quickly shelved after internal teams found that the company is unable to effectively police child sexual abuse material (CSAM) already on its platform.

The Verge reports that earlier this year, Twitter was developing a new revenue idea — an OnlyFans-style subscription feature for pornography on its platform. The company quickly put together a specialized team, referred to as the “Red Team,” to “pressure-test the decision to allow adult creators to monetize on the platform, by specifically focusing on what it would look like for Twitter to do this safely and responsibly.”

OnlyFans Porn Photoshoot (CRISTIAN HERNANDEZ /Getty)

However, the Red Team quickly discovered a major obstacle — Twitter was unable to allow porn performers to sell subscriptions as the company is still unable to effectively police harmful sexual content, including child sexual abuse material, on its platform.

In April 2022, the Red Team concluded: “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale.” The company also did not have the tools to verify that creators and consumers of adult content were of legal age. As a result, in May the company delayed the project indefinitely.

Twitter spokesperson Katie Rosborough commented on the situation, stating: “Twitter has zero tolerance for child sexual exploitation. We aggressively fight online child sexual abuse and have invested significantly in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of bad-faith actors and to help ensure we’re protecting minors from harm — both on and offline.”

However, fifteen months before the Red Team delivered the findings of its examination of the platform, researchers working on the team warned executives that Twitter’s tools for detecting child sexual exploitation material were ineffective.

“While the amount of CSE online has grown exponentially, Twitter’s investment in technologies to detect and manage the growth has not,” begins a February 2021 report from the company’s Health team. “Teams are managing the workload using legacy tools with known broken windows. In short (and outlined at length below), [content moderators] are keeping the ship afloat with limited-to-no-support from Health.”

Read more at The Verge here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan
