
Google Is Making Some Important Changes To Its Autocomplete Suggestions

Google's autocomplete feature guesses what you're searching for before you've finished typing. Sometimes it saves time, other times it's hilariously off the mark. But it should never be hateful.

The company ran into that very issue with two suggested searches. When a query began with "are women" or "are jews," Google suggested completing it with "evil." Or rather, Google's algorithm did. A representative told CNET that the suggestions are based on users' interests and search history. "Terms that appear in Autocomplete may be unexpected or unpleasant," the representative continued. "We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn't an exact science and we're always working to improve our algorithms." The company confirmed to the Guardian and the Telegraph that the suggestions have since been removed.

The issue with the suggestions isn't just that they're a jarring surprise in the middle of what was likely a perfectly innocent query; it's that they essentially encourage users to search for and consume misogynistic and anti-Semitic content. People are free to Google whatever they like, but let's not make bigotry any easier to access.
