Facebook Is Increasing Its Efforts To Detect Suicidal Thoughts

Recently, the Centers for Disease Control and Prevention reported a worrying statistic: Between 2007 and 2015, suicide rates among teenage girls in the U.S. more than doubled, from 2.4 to 5.1 per 100,000 people, a 40-year high. Although that report did not cite a specific cause for the alarming rise, other studies have pointed to social media and smartphone use, where cyberbullying has become prevalent enough to spawn a new term: cyberbullicide, suicide linked to online harassment.
At their core, social networking sites were created to connect people with friends and family. That network also means that platforms can potentially be a powerful tool for counteracting cyberbullying or providing support for a person at risk.
Today, Facebook announced a few new efforts to proactively detect suicidal posts and, hopefully, get people help faster.
First, Facebook is expanding its use of artificial intelligence to identify posts and live streams that may include suicidal thoughts. The technology, which the company first tested as a suicide prevention tool in March, can parse videos and text faster than a person can file a report. It picks up on comments from concerned friends, such as "Are you ok?" and "Can I help?", that can signal someone may be at risk. The AI is also used to prioritize the posts that appear most urgent, so reviewers see them first.
After the AI flags something, a member of Facebook's global Community Operations team takes a look. This is another area Facebook is improving: the company is adding trained reviewers and introducing automated tools for contacting first responders, who can reach the person and provide help on the ground.
"Every minute counts when you do this kind of work," Guy Rosen, Facebook's vice president of product management, told Refinery29. "This is really about working fast so we can get people help in real-time." According to Rosen, proactive detection has resulted in 100 wellness checks, in which first responders visit someone in person, over the past month.
These resources are in addition to the reporting tools already available: If you're concerned by a friend's post, click the "report" link, select the appropriate issue, and tap "send." From there, a reviewer will take a look. Your report is kept confidential, and you'll also see information about how to reach out and help the person yourself.
Facebook isn't the only tech company seeking new ways to provide help. Earlier this year, Instagram launched its #HereForYou campaign to create a community for those affected by mental illness. Crisis Text Line, meanwhile, offers immediate assistance via the most accessible means for anyone with a smartphone: a text.
These tools are by no means a fix for the issues that have arisen alongside the expansion and proliferation of social media, but they are additional resources, built into the platforms themselves, that could genuinely benefit young people's mental health.
If you or someone you know is considering self-harm, please get help. Call the National Suicide Prevention Lifeline at 1-800-273-8255.
