Facebook Is Searching for Racial Bias in Algorithms

Mark Zuckerberg (Getty/Saul Loeb)

Social media giant Facebook is reportedly forming new internal teams to study the algorithms that power both its social network and Instagram to determine if they are racially biased.

The Verge reports that Facebook plans to form internal teams to examine its algorithms across its various platforms to determine whether they are racially biased, and specifically whether the algorithms it trained using artificial intelligence negatively affect minority groups.

A Facebook spokesperson told the Verge that the team will be “tasked with ensuring fairness and equitable product development are present in everything we do.” The spokesperson added: “We will continue to work closely with Facebook’s Responsible AI team to ensure we are looking at potential biases across our respective platforms.”

The new “equity team,” as Instagram has labeled it, marks a change of course for the company, which has previously resisted efforts to study the effects of racial bias across its social networks.

Vishal Shah, the vice president of product for Instagram, said in a statement given to the Verge:

The racial justice movement is a moment of real significance for our company. Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves.

While we’re always working to create a more equitable experience, we are setting up additional efforts to continue this progress — from establishing the Instagram Equity Team to Facebook’s Inclusive Product Council.

The decision comes shortly after Instagram head Adam Mosseri promised to overhaul how the company addresses issues that minority groups have with the platform. In a blog post, Mosseri wrote: “We’ve done a lot of work to better understand the impact our platform has on different groups, and that’s helped us get to where we are today. But I think there’s more to do across some key areas, which fit into our broader company commitments.”

The move also comes after the release of an 89-page civil rights audit of Facebook, conducted over the course of two years by the law firm Relman Colfax, which called on the company to eliminate racial bias in its AI-trained systems.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com
