BY: PHILIPPE DE JOCAS
In an age when we can jet around the world in a day and reach anyone, anytime, through the power of the internet, it seems strange that our private thoughts should remain so hard to reach. Social media gives people a window into the thoughts and emotions of others, but what happens when those same thoughts turn self-destructive?
For a long time, suicide prevention has been largely a game of catch-up: an effort to intercept individuals with suicidal or self-destructive thoughts before they can act on those impulses. Suicide hotlines, counselling, and mental health treatments are more reactive than preventative – they can only swing into action after the individual in question admits to having a problem. With teen suicide rates hitting a 30-year high, preventative measures are more crucial than ever. Now, however, Facebook has revealed a new experimental technology designed to identify those at risk and redirect them towards the relevant mental health care facilities.
It’s no surprise that Facebook, since its inception, has grown into a cultural keystone of the internet. Facebook, Twitter, and other sites are windows into the psyches of their users, and have often been used to help authorities identify potential terrorists or other at-risk individuals. To join the fight against suicide, Facebook software engineers in the United States have deployed a new artificial intelligence system designed to seek out certain keywords.
True artificial intelligence – a computer mind we could sit down and have a chat with – may still be far beyond our grasp, but computer engineers and roboticists have worked to replicate several capabilities of the human brain in contemporary software. Chief among these very human traits is pattern recognition. The internet is a vast web of complicated, sometimes hard-to-grasp patterns, and software engineers focus on “teaching” programs to trawl through reams of data in search of relevant information. Facebook’s latest suicide-prevention software actively searches for broad keywords and phrases that may indicate a user is at risk. The more distress-related keywords or phrases an account uses, the higher it rises in the software’s ranking of at-risk users.
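To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of keyword-based ranking. The keyword list, scoring scheme, and function names are all assumptions for the sake of the example – Facebook’s actual system is not public.

```python
# Illustrative distress-related keywords (an assumption, not Facebook's real list).
DISTRESS_KEYWORDS = {"hopeless", "worthless", "can't go on", "goodbye forever"}

def risk_score(posts):
    """Count distress-keyword hits across a user's recent posts."""
    score = 0
    for post in posts:
        text = post.lower()
        score += sum(1 for kw in DISTRESS_KEYWORDS if kw in text)
    return score

def rank_accounts(accounts):
    """Rank accounts so those with the most distress signals come first."""
    return sorted(accounts, key=lambda a: risk_score(a["posts"]), reverse=True)
```

In this toy version, an account that uses more distress-related language simply accumulates a higher score and floats to the top of the ranked list – a crude stand-in for whatever statistical model the real system uses.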
Mentions of angst or pain, for instance, would raise one flag; responses from friends using phrases like “I’m worried about you” would raise a second. Once the software has identified one or more posts as a potential threat to the user or those around them, it sends the user’s account details to the network’s community operations team to plan the next step. Users might be redirected to mental health or suicide prevention centres, or offered one-on-one counselling from a local service provider.
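The two-flag escalation described above can be sketched as follows. Again, the phrase lists, flag names, and routing decision are hypothetical stand-ins, not Facebook’s actual logic.

```python
# Illustrative trigger phrases (assumptions for this sketch).
DISTRESS_WORDS = {"angst", "pain", "hopeless"}
CONCERN_PHRASES = {"i'm worried about you", "are you ok", "please talk to me"}

def collect_flags(user_posts, friend_replies):
    """Raise one flag for distress language, a second for concerned friends."""
    flags = []
    if any(w in p.lower() for p in user_posts for w in DISTRESS_WORDS):
        flags.append("distress-language")
    if any(ph in r.lower() for r in friend_replies for ph in CONCERN_PHRASES):
        flags.append("friend-concern")
    return flags

def escalate(flags):
    """Forward flagged accounts to a (hypothetical) community-operations queue."""
    return "send-to-community-ops" if flags else "no-action"
```

The point of the two-signal design is that neither a gloomy post nor a worried friend alone proves anything; together they make a much stronger case for a human reviewer to take a look.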
In these turbulent and troubled times, maintaining mental health can seem harder than ever. Facebook’s experiment suggests that, with the help of software and human care, the task may not be so daunting after all.