Twitter Users Call Out Racial Bias In The Platform’s Photo Preview Generator

Photo: David Paul Morris/Bloomberg/Getty Images.
Over the weekend, you may have noticed a lot of white faces on your Twitter feed. Like, more than usual. That's because many users discovered an issue with the platform's photo preview generator. As in so many areas of tech and media, Twitter's photo preview generator appears to have a racial bias problem.
Twitter users demonstrated that if you tweet an image featuring both a white person and a Black person, no matter how the image is laid out, the platform's photo preview generator tends to favor the white face, meaning previews more often crop to show the white person instead of the Black person. In one widely shared example, users tweeted tall images with the two faces at opposite ends, and the preview repeatedly selected the white face regardless of which was on top.
Interestingly, the issue first came to light when a Twitter user posted about facial recognition bias on another tech platform, The Verge reports. @colinmadland tweeted about Zoom failing to show his Black colleague's face on video calls. When he shared screenshots on Twitter as evidence of the Zoom issue, Twitter's preview generator automatically cropped to his own white face instead of his colleague's.

According to a statement from Twitter, the platform was previously unaware of the bias issue. "Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing," a Twitter spokesperson tells Refinery29. "But it’s clear from these examples that we’ve got more analysis to do. We'll continue to share what we learn, what actions we take, and will open source our analysis so others can review and replicate." Additionally, Twitter's chief data officer and chief technology officer have been interacting with users conducting various tests on the platform, and an expert at Carnegie Mellon has shared his independent analysis. We'll have to wait and see what changes come of the experiments.
