Can robots ever replace psychiatrists?

Last week I was lucky enough to be on a panel for a Medfest event. Medfest is a medical film festival hosted at medical schools all around the country during February and March. Audiences at each event see the same selected short films on the theme of medicine, after which a panel (mostly consisting of doctors) shares their thoughts on the films. My event was chaired by Alex Langford (aka @PsychiatrySHO) and I joined two consultant psychiatrists and a paediatric SHO on the panel.

As you would expect, given the subject matter, we spent a lot of time talking about ethical or philosophical points (for example, whether a woman with dementia could give meaningful consent to being filmed; how much patients should be at the centre of films about medicine, etc). One film we saw was a science fiction short set in a dystopian future about a medical robot called Dr Easy.

I’m pausing for two big warnings here. First, if you want to watch for yourself and avoid SPOILERS, skip the next paragraph. Secondly, a big TRIGGER WARNING for suicide.

The film opens with a siege situation in progress, blue lights flashing and choppers buzzing overhead. A lone man is holed up in his flat with a shotgun, shot but not killed by a police sniper. The commanding officer’s next move is to deploy a robot medic. Dr Easy, who is vaguely humanoid in shape and programmed to have a soothing female human voice, enters the perpetrator’s flat carrying a bag of medical equipment, noting as she does so that there is “a strong odour of blood and of petrol”. It is clear as soon as she speaks to the perpetrator, Michael, that she is programmed to establish rapport with humans in distress. Michael has been shot in the mouth and cannot speak, so it is down to Dr Easy to establish his needs and motives through eye contact, Michael’s attempts to write on the wall in his own blood, and Dr Easy’s access to online information about his financial and family problems. Just when it looks like an understanding might be reached and that Dr Easy might be able to coax Michael into putting the gun down, she makes an error in mentioning Michael’s son (I’ve written whole posts on why this is a bad strategy when dealing with suicidal people). Michael begins dousing himself with petrol and sets himself alight. There is a large explosion from his flat, after which we see Dr Easy exiting the building, partially alight. Partway down the staircase she stops, immobilised and presumably “dead”, insofar as we can speak of death in a robot.

Robots, by definition, have no physical sex and (as yet!) insufficient sense of self to have a sense of gender, so it was of interest to me that Dr Easy had been assigned a female voice. Although her role had elements of negotiation (“Shall I get someone to bring her [your ex-wife] here?”) it was primarily one of nurturing (demonstrating concern, establishing rapport, administering anaesthetic). Nurturing female robots are certainly nothing new. British science fiction writer John Wyndham wrote a story called Compassion Circuit as long ago as 1956 in an anthology called The Seeds of Time. The story opens with a doctor suggesting George purchase a robot nurse for his spouse Janet, an apparently depressed housewife. Janet and the robot, Hester, are alone together all day and become friends. Ironically, Hester’s state-of-the-art compassion circuit makes it difficult for her to tolerate Janet’s ongoing emotional pain. The logical step, at least in Hester’s mind, is to suggest Janet undergo a procedure to lose the ability to feel distress. Janet agrees and has the procedure carried out, essentially becoming a robot herself.

Dr Easy prompted a discussion on whether doctors could ever really be replaced by robots (with Kier noting wryly that at times his working day seemed to consist of drawing blood and filling in paperwork, both tasks a robot could easily perform). It’s been suggested – particularly in Japan – that robot assistants could be developed to address the mobility needs, and even the need for caring touch, of an ageing population.

That’s all a long way from a compassion circuit, but even relatively simple programming can generate a bot that appears to be expressing care and concern, even if it’s really doing no such thing. Eliza (yep, another female-gendered nurture bot) is a text-based “computer therapist” designed to emulate the Rogerian approach to counselling (very briefly, this means Carl Rogers’ approach of putting the client at the centre of the interaction and holding them in “unconditional positive regard”). Try Eliza out for yourself and you’ll notice she seems really good at “listening skills” – lots of open questions, prompts for further information (“tell me more…” and “can you elaborate on that?”), bouncing questions back at you, and so on. Sometimes she even pauses a little, as if giving your responses some thought. I’d be interested to know what anyone who’s ever used online chat or instant messaging for therapy thinks of Eliza. I’ve had some pretty terrible psychiatrists in the past, including doctors who have failed to make eye contact, failed to read my notes, reflected back information incorrectly, or shown no flicker of having met me before. It certainly felt as if they lacked a compassion circuit. On the social interaction front, Eliza would be at least as good as a doctor like that, and presumably it would be possible to programme robots to manage at least some diagnostic or prescribing decisions based on patient data.
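For the curious, here’s a minimal sketch in Python of the kind of trick Eliza relies on. This is purely illustrative and not Weizenbaum’s original code – the rules and phrases are my own invented examples – but it shows how a handful of pattern-matching rules, some pronoun “reflection” so your words come back at you, and a pool of stock open-ended prompts can pass for attentiveness.

```python
# Illustrative Eliza-style responder: regex rules, pronoun reflection,
# and stock "listening" prompts. No understanding anywhere in sight.
import random
import re

# Swap first- and second-person words so "I am sad" reflects as "you are sad".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine", "are": "am",
}

# (pattern, responses) pairs; {0} is filled with the reflected captured text.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?",
                      "Tell me more about feeling {0}."]),
    (r"i am (.*)", ["How long have you been {0}?",
                    "Why do you say you are {0}?"]),
    (r"because (.*)", ["Is that the real reason?",
                       "What other reasons come to mind?"]),
    (r"(.*)\?", ["Why do you ask that?",
                 "What do you think?"]),
]

# Stock open-ended prompts for when nothing matches -- the "listening skills".
FALLBACKS = ["Can you elaborate on that?", "Tell me more…",
             "How does that make you feel?"]

def reflect(fragment: str) -> str:
    """Swap pronouns in the captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text: str) -> str:
    """Return the first rule's response, or a stock prompt."""
    for pattern, responses in RULES:
        match = re.match(pattern, text.lower().strip())
        if match:
            return random.choice(responses).format(reflect(match.group(1)))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(respond("I feel nobody listens to me"))
    # e.g. "Why do you feel nobody listens to you?"
```

That’s the whole illusion: nothing in there feels anything, yet reflecting someone’s own words back with an open question is enough to seem like being listened to.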

But I like to think – and I’m going to be trusting and believe Alex, who assures me of this – that psychiatrists are improving, that a more dedicated calibre of junior doctor is now choosing to specialise in mental health. I hope we’re both right, because when psychiatry works well it does so on a level far beyond what a diagnostic bot with programmed compassion could ever achieve. Dr Easy was apparently good at reading expressions, but I don’t believe a robot could really identify the haunted look in my eyes when I’m paranoid and scared. It could look pleased to see me, but I would always know it was smiling because it had been instructed to do so. And a robot could never really support my recovery, because it is impossible to establish a rule for what recovery is. Eliza can “listen”, but she can’t feel joy or triumph. She can never look at a patient’s progress and think, “I helped make that happen.” We need more than a compassion circuit in psychiatric interaction. We need nuanced clinicians who are able to engage their empathy processor as well as their factual knowledge. I’m thinking of calling them “people.”


4 Responses to Can robots ever replace psychiatrists?

  1. Martin Baker says:

    I use chat/instant messaging a lot in support of my friend who has bipolar disorder (also Skype phone calls and webcam). These different means of communication all have their place in our relationship, but it is certainly easier/more satisfying for me (for us both) when we have access to all those means. At times — most especially a 3 month period last year when she was traveling — we had no access to webcam and only intermittent access to voice calls. We had all the chat we wanted but it was far harder to keep properly in touch and for me to help her remain aware and vigilant of her own mental and physical state. I have just given Eliza a quick try and wasn’t impressed. I think your “people” idea has a lot going for it. Good luck with that!

  2. goldenpsych says:

    I think my psychiatrist is a robot. He seems to have the same programmed responses to everything and doesn’t have any empathy.

  3. J says:

    My current psychiatrist has just retired and as yet I’ve not been allocated another one. I question whether (although I’m classed as complex needs with enduring mental illness) I really need another psychiatrist. Most of my other ones didn’t listen anyway, so maybe I might as well have a robot; at least it would save money if everyone did, and we would get the same level of service. Also, I might not get so cross about the lack of shared decision making!
