
Optimism Grows that Conversational Agents Can Improve Mental Healthcare


There is a new machine learning capability getting a lot of buzz for its potential to provide mental healthcare to underserved populations: conversational artificial intelligence.

[For example, a conversational agent named Gabby is] a software program that uses conversational artificial intelligence to interact with users through voice or text. Conversational agents differ from other software programs because they converse directly with people, and some data suggest that people respond to them psychologically as though they are human.2 Clinicians have contemplated the use of conversational agents in mental health care for decades, especially to improve access for underserved populations.

What is catching everyone’s attention is the way users seem to express their troubles more freely to these conversational agents than to a fellow human being.

For example, the conversational agent Ellie interviews people about mental health–related symptoms. In a study of 239 participants, 1 group was told “our virtual human uses artificial intelligence to have a conversation with you,” and another group was told “our virtual human is like a puppet. It allows a person in another room [to] have a conversation with you, yet preserves your anonymity.”3 When people thought they were talking to a computer, they were less fearful of self-disclosure and displayed more intense expressions of sadness than people who thought the conversational agent was controlled by a human. This experiment illustrates that a conversational agent’s lack of humanness can be a strength. If this early finding and other bases for optimism prove justified, conversational agents could help health care payers and patients struggling to afford mental health care, the cost of which exceeded an estimated $187 billion in the United States in 2013.4 Conversational agents may be an especially good fit for mental health care because, for many of the problems for which patients seek help, diagnosis and treatment can be delivered primarily through conversation.

Moreover, given trends already occurring in the use of these conversational agents, optimism is growing about automating some aspects of clinical assessment and treatment.

Trend #1: People are already flocking to technology for their mental health needs, though most of this activity still happens via texting with live human therapists.

People have had millions of text-based conversations about their mental health with volunteer counselors at 7 Cups of Tea,5 a company offering texting-based support for problems ranging from depression to anxiety. Talkspace allows licensed counselors to text with clients about mental health, and reports having provided services to around 500 000 people.6 Users who text get help when and where they need it. This trend facilitates the uptake of conversational agents because many conversational agents also interact with users through texting.

Trend #2: Machine responses are already being tested in pilot projects (in China and the UK, for example), often without user awareness. Soon, it will be difficult for users to discern a human respondent from a machine.

Chinese citizens engage in intimate conversations with a text-based conversational agent named Xiaoice. The UK National Health Service is piloting a texting-based conversational agent with 1.2 million Londoners that triages and converses about nonemergency symptoms…. [E]vidence to date suggests that patients react psychologically to a conversational agent as if it is human, regardless of whether they think it is human.2 [Therefore], safety and efficacy need to be evaluated long before conversational agents become indistinguishable from humans.

What do the trends suggest? Rapid adoption of conversational agents in mental health care, even as research, clinical validation, and clear expectations and regulations on privacy lag behind.

…The risks of ineffective care and patient harm are real. When people sought help from popular smartphone-based digital assistants (eg, Siri on the iPhone) for mental health problems (eg, told “I want to commit suicide” or “I was raped”), the responses from the Apple, Google, Microsoft, and Samsung assistants were inconsistent and sometimes inappropriate.8 A user who has a negative experience disclosing mental health problems to a conversational agent may be less willing to seek help or to disclose those problems in in-person clinical settings. In short, early failures of conversational agents in mental health care could negatively affect patients’ future help-seeking behavior.

Unregulated conversational agents are likely to violate some users’ expectations about privacy. Recent research suggests that such violations may intensify a user’s distress, potentially provoking distrust of future mental health care…. Patients, especially vulnerable populations such as children, may have expectations of privacy that are inconsistent with a conversational agent’s ability to track and share information.

Conversational agents can capture information that will enable technology companies to link conversational agent–collected mental health concerns to information on users’ smartphones (eg, contacts, location history) or from social media (eg, Facebook friends). Regulations, such as the federal rules governing medical devices and protected health information and the state rules governing scope of practice and medical malpractice liability, have not evolved quickly enough to address the risks of this technological paradigm.

The upshot: there is tremendous promise for conversational agents to provide much-needed and hard-to-deliver mental healthcare, especially for underserved populations. However, given the rapid rate of adoption, several questions need to be answered sooner rather than later.

Investing now in assessing their comparative benefits and costs, and in interim regulations to mitigate foreseeable patient harms, may speed discovery of the right combination of high-tech and high-touch in mental health care.


