How often do you see a person say something on Facebook that could be cause for concern, but you're not sure of the best way to help? Today, Facebook is addressing this problem in a major way, with three new suicide prevention tools.
While the social network has previously allowed you to report a friend's status for review, today's update offers more immediate ways to take action and show your support. First, Facebook says that it will try using artificial intelligence to identify posts associated with suicide or self-harm. For these posts, a reporting tool will be "more prominent." In other words, you'll be able to spot and respond to a potentially dangerous post in your News Feed more quickly than before.
Second, Facebook is updating one of its newer products, Live. If something a friend says during their live stream concerns you, you can reach out in real time by messaging them directly or reporting the stream to Facebook. Resources for contacting a helpline or messaging a friend will then pop up on the screen of the person broadcasting.
Facebook is also expanding its work with various crisis support centers, such as the National Suicide Prevention Lifeline and the National Eating Disorders Association. In what will start as a test, users will be able to directly connect with and talk to someone from one of these organizations online.
As these updates roll out and are tested, it will be important to keep an eye on them. Are you noticing reporting notifications pop up where they shouldn't? If so, speak up. We're all part of the Facebook community, and each of us has a responsibility to look out for one another, especially when sensitive issues of mental health and safety are involved.
If you or someone you know is considering self-harm, please get help. Call the National Suicide Prevention Lifeline at 1-800-273-8255.