
Facebook Wants To Stop Revenge Porn Before It Begins, But The Process Is Tricky

Photographed by Erin Yamagata.
As soon as someone shares a photo of you online, it can have a ripple effect: Others can easily share it, reproduce it, even modify it. Of course, many times, those shared photos — ones of you smiling at a wedding or a graduation — are harmless. But for survivors of revenge porn, the nonconsensual sharing of intimate images, that ripple effect can prove disastrous, with consequences that last far longer than the seconds it took for someone to post the photo.
Last year, Facebook sought to curb this ripple effect. In addition to introducing new reporting tools, the company came up with a way to create what's known as a hash, or digital fingerprint, of a photo. That fingerprint lets Facebook detect any future attempt to share the same photo and block it.
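To make the idea concrete, here is a minimal, purely illustrative sketch of one common way such a fingerprint can be built, an "average hash," written in Python with the Pillow imaging library. Facebook has not published the algorithm it actually uses, and a production system would rely on far more sophisticated perceptual matching; the function names below are invented for this example.

from PIL import Image

def average_hash(path, size=8):
    # Shrink to a tiny grayscale thumbnail so differences in resolution,
    # format, and color don't change the fingerprint much.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the average?
    bits = "".join("1" if p > average else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(hash_a, hash_b):
    # Counts differing bits; a small distance means the two images
    # are near-duplicates (for example, a resized or re-saved copy).
    return bin(hash_a ^ hash_b).count("1")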
This has allowed Facebook to prevent the continued posting of thousands of nonconsensually shared intimate images. But there was always a problem with this approach: It does nothing to prevent the first share of the photo, and "once it’s shared to communities you care about, the devastation and initial harm has already happened," Antigone Davis, Facebook's Global Head of Safety, told Refinery29.
Today, Facebook is launching a small pilot program in four countries — Australia, Canada, the U.K. and U.S. — intended to target that initial share on your News Feed, Instagram feed, or Messenger. (According to Sky News, the pilot program has been running in Australia since last year.)
Here's how it works: If you're concerned someone might post an intimate image of you, you can go to the site of one of Facebook's partners on this pilot — the Cyber Civil Rights Initiative and National Network to End Domestic Violence in the US; the Australian eSafety Commissioner; the UK Revenge Porn Helpline; and YWCA Canada — and submit a form. Then, you'll receive an email with an encrypted link, where you can upload the image(s). A team of five specially trained Facebook reviewers will receive the link, create hashes of the images, and then delete them from Facebook's servers within seven days. If someone then tries to share one of the images on any Facebook-owned platform, be it Instagram, Facebook, or Messenger, they'll be blocked from doing so.
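In code terms, that reviewer-side step might look something like the sketch below, which builds on the hypothetical average_hash() above: only the fingerprint is kept, and the submitted file itself is deleted. The names (hash_bank, process_submission) and the structure are assumptions for illustration, not Facebook's internal system.

import os

# A stand-in for the fingerprint store: it holds hashes only, never images.
hash_bank = set()

def process_submission(image_path):
    # Fingerprint the image a person has submitted, then discard the image itself.
    fingerprint = average_hash(image_path)  # from the earlier sketch
    hash_bank.add(fingerprint)
    os.remove(image_path)  # the original file is not retained
    return fingerprint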
Facebook is aware that this puts people in a difficult position.
"The obvious, big challenge is we're asking people to send an image that hasn’t been shared publicly, but they’re just concerned will be shared in some way," Davis said. "It’s very hard — these images were shared most likely in an intimate relationship, although maybe in another intimate setting, and now [the user is] in a position of having to share it with people they don’t know. One of the things we want to figure out is how can we minimize that challeenge for the individual and make it the most sensitive, least intrusive approach."
Users who have been threatened with the sharing of nude photos will have to weigh the potential outcomes: having the threat carried out, in which case their friends and family might see the image, or sharing it ahead of time with Facebook, in which case the team of five unknown reviewers will see the image but delete it before it ever hits anyone's News Feed. Neither outcome is ideal, and the latter can feel counterintuitive.
Davis said Facebook is taking additional safety precautions. Besides requiring people to go through a partner organization to make the initial request and using an encrypted link for uploads, the company will store only the hash of the images in a "media match service bank." When someone tries to post a photo, Facebook's algorithms will look for any matches between that photo and the digital fingerprints in the service bank.
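Conceptually, that matching step can be as simple as the sketch below, which reuses the hypothetical average_hash(), hamming_distance(), and hash_bank from the earlier examples. The distance threshold is an arbitrary placeholder, not a figure from Facebook.

def is_blocked(upload_path, max_distance=5):
    # Fingerprint the would-be upload and compare it against every stored fingerprint.
    candidate = average_hash(upload_path)
    return any(hamming_distance(candidate, stored) <= max_distance
               for stored in hash_bank)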
Although nonconsensual image sharing is "a low volume problem" compared to something like spam, the potential for harm is significant, Davis said. The statistics back her up: According to the Cyber Civil Rights Initiative, 82% of survivors say they suffer "impairment in social, occupational, or other important areas of functioning," 49% say they have been harassed by others who see the images, and 38% say their relationships with friends have suffered as a result. The majority of survivors are women between the ages of 18 and 30.
"I think what’s worth trying, because there’s a desire for it — there are victims who work with the partner organizaitons that have been threatened with sharing, so it's responding to a need," Danielle Keats Citron, a law professor at the University of Maryland Carey School of Law and author of Hate Crimes in Cyberspace, told Refinery29. Citron has advised Twitter and Facebook on policies around intimate image sharing. "I’m comfortable with what they’re doing because I know how hard they’re working on the security."