Suicide prevention gets a new partner: Facebook

If you or anyone you know is having thoughts of self-harm, please reach out to organizations like the National Suicide Prevention Lifeline or call their hotline directly at 1-800-273-TALK (8255). If you or someone you know is in immediate danger of self-harm, call 911 immediately. Your life is worth living and you don’t deserve to suffer.


Generally, my Facebook newsfeed is filled with silly photos of people waiting for the subway, declarations of love for partners, and snarky comments about current events. Sometimes, however, I see posts that make me concerned about someone’s mental health. I’ve been fortunate not to have seen anything that I felt needed to be reported, but I know that’s not the case for everyone.

Facebook announced yesterday that they are partnering with Forefront, Now Matters Now, the National Suicide Prevention Lifeline, Save.org, and other mental health organizations to create a more effective reporting program for people whose friends are expressing suicidal thoughts on Facebook.

When someone sees a troubling post from a friend (let's call him Gerald), they will have the option to report it directly to Facebook. Right now, in the upper right-hand corner of every post, there’s a little downward arrow that, when clicked, will let you report the post for potentially suicidal content. (I haven’t been able to find screenshots of what that screen will look like, and the capability has not yet been activated on my account, so I couldn’t make any of my own.)

The post will then be reviewed by “teams working around the world” to determine whether it does in fact suggest that Gerald is in danger of self-harm. If so, the next time Gerald logs into his account, he'll see this:

[Screenshots of Facebook’s suicide prevention screens. Source: Huffington Post]

One of the things that seems most promising is that Gerald doesn’t appear to have the option to dismiss these screens. He will have to at least click through the resources in order to get to his newsfeed. Hopefully, this will help reach some people who need help but aren’t able, for whatever reason, to ask for it or receive it.

I also hope that Facebook is planning to critically evaluate this change. There are lots of unintended consequences that could arise from this new reporting system: a drop in posts containing potentially suicidal content, quick click-through speeds that imply users aren’t actually reading the resources, and gross misuse of the capability that floods the reviewing teams, making effective review difficult or impossible.

And when Facebook evaluates the initiative, I hope they make that information public. If social media is going to provide a platform for mental health intervention, we need to know whether a huge intervention like this actually works.

To learn more about this Facebook change, check out the Facebook Safety post explaining what’s happening.