Google Sparks Outrage With Offensive, Racist Tags

Google's image-recognition algorithms made a whopper of a faux pas this week. Google's new Photos service mistakenly identified the selfies of two Black friends as "gorillas." Not cool, Google.

Google Photos lets you store unlimited photos online, along with photo-editing tools for tweaking them. The service also uses Google's artificial-intelligence smarts to identify objects and people in photos, so that you can go back and search for photos of food, or of your cat, or of people. In this case, the image recognition went horribly, horribly wrong.

Twitter user and computer programmer Jacky Alciné tweeted about the incident, asking, "What kind of sample image data you collected that would result in this son?" and adding, "I understand HOW this happens; the problem is moreso on the WHY."

Google's chief architect of social, Yonatan Zunger, swiftly replied to the tweet, agreeing that "this is 100 percent not okay." Zunger asked for permission to examine the photos that prompted the racially charged identification result, and his team issued a fix in just a few hours. Now, Alciné and his friend are no longer identified as gorillas, but they still aren't identified as humans in the photo. Apparently, the app still has a lot of kinks to work out in its facial recognition — Zunger said that the software used to tag people of all races as dogs.

In this particular case, the algorithm had difficulty with the faces because they were partially obscured, and because different skin tones and lighting conditions require different image processing. Seems like Google needs to take some lessons from Facebook's scary-accurate facial recognition.

In a statement, Google told Ars Technica:

We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.
