Facebook’s New Algorithm Combs Posts to Identify Potentially Suicidal Users


The site needs to strike a delicate balance between privacy and safety, but experts say it’s the right move.

By Kaleigh Rogers | MOTHERBOARD

In January, a 14-year-old Miami girl broadcast her suicide on Facebook Live, to the horror of her friends and family. Just three weeks earlier, a 12-year-old in Georgia had done the same on a site called Live.me, and the video later circulated on Facebook. Multiple suicides have also been preceded by a goodbye post on Facebook.

While Facebook has long had protocols in place for identifying and reaching out to potentially suicidal users, it recently upped the ante. This week, Facebook announced beefed-up suicide prevention tools that use algorithms to scan posts for potentially suicidal language (posts about sadness, pain, or not being able to take it anymore, for example) and to flag comments that may signal a problem, such as “are you okay?” and “I’m here for you.”
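Facebook hasn’t published how these algorithms actually work, but the general approach it describes, matching signal phrases in both a user’s posts and friends’ comments and then surfacing flagged posts for review, can be illustrated with a minimal, hypothetical Python sketch. The phrase lists, threshold, and function below are illustrative assumptions, not Facebook’s system, which would rely on trained classifiers and human reviewers rather than simple keyword matching.

# A minimal, hypothetical sketch of phrase-based flagging, in the spirit of
# what Facebook describes. Facebook has not published its implementation;
# the phrase lists and threshold here are illustrative assumptions.

# Phrases that might appear in an at-risk user's own post (assumed examples).
POST_SIGNALS = ["can't take it anymore", "so much pain", "want to end it"]

# Phrases that might appear in worried friends' comments (assumed examples).
COMMENT_SIGNALS = ["are you okay", "i'm here for you", "please call me"]

def flag_for_review(post_text: str, comments: list[str]) -> bool:
    """Return True if the post or its comments contain enough signal
    phrases to warrant escalation to a human reviewer."""
    text = post_text.lower()
    hits = sum(phrase in text for phrase in POST_SIGNALS)
    for comment in comments:
        lowered = comment.lower()
        hits += sum(phrase in lowered for phrase in COMMENT_SIGNALS)
    return hits >= 1  # illustrative threshold: any single match triggers review

# Example: a worried comment alone is enough to surface the post.
print(flag_for_review("feeling down today", ["Are you okay? I'm here for you."]))
# -> True

In a real system, a bare keyword match like this would produce far too many false positives; the point of the sketch is only to show how signals from posts and comments can be combined before anything reaches a human.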
