Facebook working to prevent suicide on its live-streaming service

After several people used Facebook's live-streaming option to commit suicide, the platform has created a way for users to report when they think someone may harm themselves.

Facebook will integrate real-time suicide prevention tools into Facebook Live on Wednesday to address the alarming phenomenon of people taking their own lives on its live-streaming service.

Suicide has surged to its highest level in nearly 30 years, according to the National Center for Health Statistics, adding urgency to the social media giant's efforts after two separate incidents were broadcast on Facebook Live in January.

When someone reports a post for suicide or self-injury, live-chat support from crisis support organizations — such as the National Suicide Prevention Lifeline and the Crisis Text Line — will be available through Facebook Messenger.

Think your friend is in trouble? Facebook could help

If you see a disturbing post from a Facebook friend, you don’t have to leave the social network for help.

This week, Facebook’s Safety Center, a resource page with tips on bullying prevention and suicide hotlines, became available in more than 50 languages. Beyond these informational pages, Facebook continues to refine its report tool, which lets users flag friends’ posts to get advice on how to personally talk to that friend or to ask Facebook to send that friend resources.

To use the tool, go to the concerning post, click the down arrow in the top right corner, select “Report Post” and click “I think it shouldn’t be on Facebook.” Then, select what is wrong with the post. Users concerned about a suicidal friend can select “It’s threatening, violent or suicidal.” The next window asks them to choose a type; one option is “self-injury or suicide.”

A window then pops up, prompting users to contact a local authority immediately if they believe their friend is in danger. Plus, Facebook offers a list of next steps: “Reach out to a friend,” “Learn how to talk with Facebook Safety about this” and “Ask us to look at the post.”

When users select "reach out to a friend," they have the option to send a direct message to another friend with the concerning post and Facebook suggests this language: “Hey, this post makes me feel worried about [Name]. Do you have any idea why [Name] would have written this? Do you think there's something we can do to help?”

Selecting “learn how to talk with Facebook Safety” directs users to a Facebook resource page.

The third option, “Ask us to look at the post” will submit an inquiry to Facebook’s Help Team, which is staffed 24/7. Usually, the team responds to these reports within 48 hours, said Antigone Davis, Facebook's global head of safety.

If the team believes the post indicates cause for concern, it will send the Facebook user who posted it a message asking how Facebook can help. A page will pop up showing options: talk with a friend, contact a helpline (with local contact information based on the user’s country) or get tips and support. The user won’t know who submitted the report.

“When somebody is worried about somebody, they aren’t sure what to say,” said Dan Reidenberg, executive director of SAVE. “The tool is incredible because it not only helps the person who is worried about somebody but it helps the person who is at risk.”

The tool, which was updated in June, is one part of Facebook’s broader safety initiative. Last year, the network started posting Amber Alerts for missing children.

“This approach has created a community we’re proud of,” Davis said.

Facebook began working on suicide prevention ten years ago and has since partnered with organizations including Forefront, SAVE, Suicide Prevention Lifeline and Samaritans.

Davis said the next step for Facebook is providing more resources specifically for parents.

© 2017 WXIA-TV

