The world’s biggest social network is using artificial intelligence to identify suicidal users.
Facebook built a series of algorithms that identify warning signs in posts and comments. Once a Facebook review team confirms the concern, the company reaches out to users who may be at risk and offers them resources for getting help.
According to sources, the algorithms are “trained” on samples of posts that were previously reported as being written by suicidal users, such as posts about pain, sadness, and similar topics.
The artificial intelligence system is currently being tested in the United States. It marks the first use of artificial intelligence to analyze Facebook posts since Mark Zuckerberg’s announcement last month that the company hopes to use algorithms to identify posts made by terrorists.
Facebook also announced new ways to address suicidal behavior on Facebook Live and signed partnerships with several U.S. organizations that specialize in helping people with mental health issues, allowing vulnerable users to contact them through Messenger.
Facebook has offered help to suicidal users for years, but until now the social network relied on other users to flag concerning posts via the “Report” button.
This latest effort comes after a 14-year-old from Miami died by suicide on Facebook Live in January.
What do you think of Facebook’s efforts to help suicidal users?