
Suicide by Facebook

By Jack Santos | February 17, 2017

Today’s Wall Street Journal highlights a quote by Mark Zuckerberg:

“There have been terribly tragic events — like suicides, some live streamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner”

That struck a chord with me on two levels: a friend who committed “suicide by Facebook,” and the ubiquity of algorithms.

Yes, you heard that right. I was directly affected by one of those events. She was a longtime friend with whom I had both personal and professional relationships: I attended her “Boxing Day” parties and hired her clients (she was an IT recruiter).

We watched on Facebook, reached out, and were stunned by her very public decline into depression, which began shortly after the death of her partner. Were her Facebook friends negligent in not reaching out enough?

Was her last post:

“… So many deadly pills in the house. ..I’m so tired of being lonely. ..going to sleep and not waking up to fight another battle sounds great.”

enough of a cry for help?

People responded, yet it still happened.

Facebook now wants to develop algorithms to catch these events ahead of time. It’s like putting a camera in your home, monitored by the authorities, and using it to stop someone from walking out onto the bridge.

Gartner saw algorithms playing a major role in IT and business nearly five years ago, and that prognostication rings truer every day.

Yet there is much we still have to learn about algorithms. If you have ever called a call center to get a SNAFU fixed and been told, “I have no idea why the computer did this, and I don’t know how to fix it,” expect an algorithm to be behind that answer in the future. But wait – it gets worse. What if there is no easy way to correct the algorithm, no feedback that lets you change your behavior or anticipate a SNAFU, and no way to eliminate a bias in the algorithm? We already hear stories about sophisticated crime-analysis algorithms that exhibit racial profiling. Many corporate algorithms are proprietary and secret, guarded as intellectual property. Yet transparency may be a major way to prevent algorithm errors and meltdowns, even if it also makes it easier to game algorithm results.

The natural human tendency is to “trust the machine,” go with the crowd, and not ask questions. When algorithms become ubiquitous, that tendency could be deadly. And when they are used to anticipate self-destructive behavior? It may work. It may not.

How we think about algorithms, and how we manage, monitor, and regulate them – it’s all up for grabs.

In the meantime, Facebook will be watching you…and Gartner will be watching algorithms.

The Gartner Blog Network provides an opportunity for Gartner analysts to test ideas and move research forward. Because the content posted by Gartner analysts on this site does not undergo our standard editorial review, all comments or opinions expressed hereunder are those of the individual contributors and do not represent the views of Gartner, Inc. or its management.
