Goodbye Samaritans Radar… hello Facebook report button

There’s no denying that suicide is a very real and serious problem. In 2013 over six thousand people in the UK took their own lives, a tragic figure which suicide prevention programmes rightly hope to reduce. But you might be surprised at just how common suicidal thoughts are among people who don’t go on to act on them. Research suggests that up to 17% of us will consider suicide at some point – that’s almost 11 million people in this country alone. I’m one of them; maybe you are too.

It’s this that has prompted Facebook, the world’s biggest social media platform, to introduce its own suicide prevention tool (interestingly, Radar, the Samaritans’ disastrous attempt at such a tool, has finally been put to bed today). It makes sense; as more and more of us spend our lives online, more and more people are talking about suicidal thoughts in their blogs or on social media. The plan, already in place in the US, is to introduce an option to report a suicidal comment and have that report reviewed by a third party. If the report is deemed worrying, the next time that user logs in they will be greeted with a pop-up box saying, “Hi [name], a friend thinks you might be going through something difficult and asked us to look at your recent post.” The user is then offered a choice: someone to talk to (a suicide hotline) or “tips and support”.

So what’s not to like? Firstly, the online mental health community needs to remain a safe space. Although most people who have suicidal thoughts never act on them, the experience itself can be frightening and isolating. Thoughts of taking your own life can seem too shocking to share with your nearest and dearest, which is why the anonymity of social media can be so valuable. Developing internet friendships with people experiencing similar issues can be a genuine lifeline, and the online environment offers people the chance to speak the truth about their feelings without worrying loved ones. Being able to be honest is vital. With the reporting structure in place I fear people will begin self-censoring, frightened their words will prompt someone to hit that button.

And when people talk about suicide there is a great deal of complexity and nuance. Suicidal thoughts can be fleeting and nebulous – wondering, for example, if friends and family might be better off without you. Some people may express a desire to “not be”, perhaps posting thoughts such as, “I just want to go to sleep and never wake up” – but with no real indication that they are planning to end their life. These kinds of statements are incredibly common within the online mental health community. I’ve said them many times myself. But most users will not know this and may well hit the report button in good faith.

Let’s not forget it’s not only your Facebook friends involved in the process. I’d particularly like to know more about the “third parties” charged with reviewing the reports. Who are they? What training will they be given? Will they understand how common suicidal thoughts are, how often people express them, and how the online mental health community uses social media? I can only imagine the support they can offer will, inevitably, be limited. Almost anyone expressing suicidal feelings is likely to be aware of services like the Samaritans, while offering someone who feels suicidal a list of generic hints and tips feels almost derisory.

Finally, it’s invasive. The user can hit skip, but cannot opt out of the system so that they are never presented with the dialogue box (perhaps again and again). It has all the subtlety of the Annoying Paperclip that graced Microsoft Word some years ago. “Hi! It looks like you’re writing a letter!” “Hi! It looks like you’re going through something difficult!” No shit, Sherlock. Whilst nowhere near as intrusive as last year’s disastrous Samaritans Radar tool, this still gives the power to the button hitter and leaves the recipient to dismiss the dialogue box – without knowing who has reported them.

Some might say this isn’t meant for the adult online mental health community, but there’s your real problem. In the pool of online suicide references, the vast majority will come from that community. Maybe Facebook is thinking of younger users expressing first tentative suicidal feelings, but that’s not unproblematic either. There is clear potential for people, young or old, to have malicious reports made about them. Imagine being faced with the paperclip – sorry, dialogue box – every time you log in to Facebook. My fear is that this will drive people off the platform, disconnecting them from their friends and their coping mechanism. And of all the suicidal people I meet online, it’s the ones who disappear I really worry about. This move looks like a great way for Facebook to generate goodwill, but their invasive tool might just hurt the people it wants to help.


About purplepersuasion

40 something service user, activist, writer and mother living with bipolar disorder. Proud winner of the Mark Hanson Prize for Digital Media at the Mind Media Awards #VMGMindAwards 2013. Winner of the World in Mentalists Mood Disorder blog 2012. Regular guest blogger for the International Bipolar Foundation Expert by Experience working with Mind training department. Working on The Incoming Tide, a bipolar memoir. Find me on Twitter @BipolarBlogger or at my Facebook page
This entry was posted in Activism, Social media, Suicide. Bookmark the permalink.

22 Responses to Goodbye Samaritans Radar… hello Facebook report button

  1. I’m glad it’s not just me that thinks this is a poor sticking plaster attempt by Facebook.
    I’d like to think that my friends on Facebook, and any groups I’m in, would actually try and contact me personally if they’d seen something that caused them concern. I know I’ve done that when I was worried about a friend who I knew was struggling with PPD and posted something that concerned me. Perhaps we should work towards ending the stigma that surrounds mental health issues and making people understand what’s actually going to help us, instead of an anonymous button on a social media page that doesn’t promote compassion, empathy or support – it just allows people to “feel better” that they can “do something” which actually may not be that helpful.
    Hope am making sense. Sick child, sleep deprivation and lack of caffeine = incoherence! 😜

  2. Thanks for this info – I was not aware of this change and it will inform my research on the psychological well being of healthcare staff – thx – great blog!

    When you call, you are linked to The National Suicide Prevention Hotline and their well-trained volunteers, who can get you to local resources as well. Dr. Ursula Whiteside (psychologist) and founder of fought for attempt survivors to be active in the development of the program. She is very active on social media and probably would be the best to answer some of your concerns. I am not sure how it is actually going in terms of intrusiveness. The emphasis on choice over being imposed upon was certainly considered by the developers.

    • Hi, you are clearly talking about the American version. As yet we do not know who will be doing the assessing or providing the support in the UK. If this is the Samaritans (our main suicide hotline) this would be ironic in the extreme given their own disastrous attempt at a similar intervention. I am asking very specifically how it would be done here. Also I would like to know what happens when suicide references are mentioned on FB pages? I have a page as part of my business and I post my blog entries – which often mention suicide – there. Will I have this dialogue box on my page? I can tell you straight if this is rolled out in the UK I will be going on FB less and less and being less and less honest about my life. Well done.

  4. It worries me that I could be logged out of fb and therefore unable to contact some of my best supporters via messenger, exactly when I need them most :-\

  5. Charlotte,
    Was too tired to find this last evening, but here is a story explaining the development of the program and how it works. . My understanding is it simply prompts you if you want help. You can easily decline this without any retribution or punishment.

    • Wow. The fact that you even use the words retribution and punishment is pretty scary.

      • Good point, and maybe too strong a wording to convey my feelings. I think we agree that we don’t want people to feel like they are imposed upon or punished for expressing feelings on Facebook. The American model, or so I am told, will not do that. It will simply offer the support and you can decline it if you want. The hope being that the prompt will give someone pause.

      • Yes but the whole point is: what if you don’t want it to be offered to you?? Why should I have to decline help any time someone who knows nothing about mental health decides I need it offered? Why can’t I just opt out and say that I simply do not want to be a party to this invasive intervention?

      • Social media is such a large part of people’s lives. The stakes of a completely hands-off policy for social media are too high, as people are dying. But so are the stakes of infringing on people’s civil liberties and expression. This is a debate that is still in its infancy. Clearly there is a need to create a safe space on social media – one that will acknowledge that suicide is a huge public health problem but respect people’s right to express themselves. Just not sure what that would look like?

      • I know it’s a difficult balance, but the whole point of social media is that it’s *social*. If I look like I’m suicidal (and if you look at my Twitter RN that is very much the case) I am sharing that with friends, the world, whoever. If someone’s a genuine friend they will message me and say, “Hey, Charlotte, are you OK? Bit worried about what you said there.” People talk about suicide ALL THE TIME without doing it. If unwanted help pops up all the time as a response they will shut up – or leave. Thankfully I would never really talk about my issues on FB anyway – because I don’t think it’s an appropriate environment. Old school friends, former work colleagues, my mum’s cousins I haven’t seen in 15 years…they would press the bloody button because they lack the knowledge and connection to ask me privately. I’ll be going on FB even less now.

      • 1st and foremost, hope you are well. It is a great support system you have and I hope it helps. That is what it is all about, finding a safe community that works for you. On Facebook, perhaps there should be more “closed groups” of people you can count on and who know your “baseline” a little better. Not sure how the privacy settings come into play. As always I appreciate your fresh and thought-provoking posts. You are a great advocate and asset to the mental health community on social media.

  6. I am extremely happy that Facebook now has this option, however I see some issues that most likely will occur. Unfortunately, those issues have already occurred. I know this because I just experienced one of these issues last night. I’ve never expressed suicidal thoughts or gestures on Facebook and never intend to, due to the fact that it’s a very personal issue and I don’t want everyone in the world to know I am struggling. The people closest to me know when I am struggling so they can look for the signs (of suicide) in me. I tell you this because I know of people who do share their suicidal thoughts and gestures on Facebook. Some do it because they are honestly and truly suicidal and desiring someone to help support them through a tough moment. Then there are those who do it for pure attention.

     Then there is the issue I experienced last night. I had put on Facebook that “I love my job however the last few days have been pretty intense due to clients behavior. It makes me realize how difficult I was in the midst of my own personal struggles. It because of my clients on why I choose to be in recovery. It pains me when I see my clients struggling and such agony, knowing how they are feeling and wishing that I had a magic wand to wave and make everything better.” Apparently, one of my Facebook friends who has known me for a decade decided to “report” me to Facebook, who in turn took down my post and called the police. The police determined that I was NOT a danger to myself. In fact the police and I laughed over the situation, because this friend of mine knows my phone number and could have easily called me to see how I was, or called a mutual friend (who lives in my area) and had them check on me. It is my concern that people will read too much into what someone says on Facebook, and then in turn Facebook will overreact and call the police when it is not needed and waste taxpayer money.

     On a good note, Facebook did send me an apology email and put my post back up.

  7. demonicdivas says:

    It’s a tricky question, isn’t it. On the one hand, could companies such as Facebook find themselves at the end of a lawsuit or similar for not having taken steps to protect vulnerable people? But I do agree with you. It’s frightening enough to have to deal with suicidal thoughts, and actually quite brave to voice it on a social media platform. In the past perhaps this openness prevented some suicides, as people felt safe to share their thoughts. That safety will now be removed. I know I would feel desperately uncomfortable knowing that some third party might come and zero in on my messages. It would actually make my paranoia worse and, for some, have the opposite effect to the desired one of preventing suicide. It is a significant loss of privacy. A good article, thank you.

  8. This is why I blog!! Seems that here there is no awfully intrusive button to report me. The Facebook stuff is always mediated somewhat because I have a horrible fear of the ex finding my FB details again. x

  9. chinorossioc says:

    Reblogged this on Jamie's Mental Health Blog and commented:
    Interesting, really interesting. It’s a minefield, so I ponder the question. Is this likely to make a positive impact on the whole, or a negative? What do you think?

  10. Pingback: Where the damage is done | purplepersuasion

  11. Pingback: Social media: Look up or look down? | Bipolar :): Burst
