This article is brought to you by Xi’an Jiaotong-Liverpool University, a leading international joint venture university based in Suzhou, Jiangsu, China.
What comes to mind when you hear the word psychotherapy? Chances are that, like most people, you think of sitting on a couch across from a soft-spoken therapist with a notebook in hand. This stereotype of therapy as a conversation-driven, face-to-face interaction in a quiet room is so firmly embedded in our collective psyche that it can be hard to imagine new forms of treatment being developed and popularized.
But people’s mental health needs are changing, and that requires innovative solutions. Even as conversations about psychological well-being have become more commonplace in society at large, mental health conditions are increasing worldwide. According to Gallup, a U.S.-based analytics firm, anger, stress, worry, and sadness have been on the rise for the past decade, and all reached record highs in 2020, when the COVID-19 pandemic severed social connections for many and surrounded them with death and disease, poverty and anxiety, hardship and uncertainty.
China, which is still rolling out lockdowns to contain COVID-19 outbreaks, is not immune to what the United Nations described as “a looming global mental health crisis.” In a country where mental illness was long misunderstood and talking about it is still viewed as taboo by many, the pandemic has led to a spike in mental health challenges. In 2020, when the outbreak was at its peak, a nationwide survey by a Shanghai university found that nearly 35% of respondents had experienced symptoms of depression, anxiety, insomnia, or acute stress.
It has also highlighted the supply-demand imbalance in mental health care across much of China, especially in the countryside, where awareness of psychological well-being is lacking and access to therapy is extremely limited. According to a 2019 paper, in a country of 1.4 billion people, there were only 1,000 rehabilitation psychiatrists, 1,500 mental health social workers, and 3,000 psychotherapists working in professional institutions.
This is exactly where technology comes to the rescue, with the potential to expand access, reduce costs, and identify patients who can benefit from individualized emotional support. And among the technologies that could potentially open new frontiers in psychotherapy, nothing looms larger than virtual reality (VR) and artificial intelligence (AI).
Individualized virtual reality to relieve depression
Picture this: When you meet up with a therapist for your first session, you are told to create an avatar in a virtual world. With a VR headset strapped to your head, you are given the choice to put your character in a specific environment — be it a room, a beach, a forest, or a castle. On the screen, your avatar spots a crying girl in the world you selected. Hoping to make her feel better, the virtual version of yourself walks toward the girl, using words and actions to comfort her.
This might sound like a bizarre scenario for now, but it is part of a VR system designed to give people individual choices when virtual environments are used in therapeutic treatment, especially for those dealing with depression and anxiety. Scientists say this novel design, named iVR, may not only enhance participants’ self-compassion and improve their mental health in the long run, but also help therapists understand their clients better, particularly when clients are not forthcoming with information or struggle to explain how they feel.
“It allows participants to make choices that would be difficult for a mental health professional to replicate in real life. Those choices might enable the therapist to dig deeper into the psychological makeup of the patients and get information by observing their interaction that they would not be able to extract otherwise,” said Professor Hai-Ning Liang, a researcher on the project from the Department of Computing at Xi’an Jiaotong-Liverpool University (XJTLU).
Over the last two years, in collaboration with a New Zealand-based team of researchers led by Dr. Nilufar Baghaei, Liang has developed three iVR prototypes, the latest of which was built with the Unity game engine and runs on Meta’s Oculus Quest 2 headsets.
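To make the choice-driven design easier to picture, below is a minimal Python sketch of how a session might record a participant’s selections (environment, words, gestures) for the therapist to review afterward. It is not the team’s Unity/C# implementation; every class, function, and value in it is a hypothetical stand-in, used purely for illustration.

```python
# Hypothetical sketch of the patient-driven choices described above.
# NOT the XJTLU iVR Unity/C# code; it only illustrates how environment and
# avatar choices could be logged for a therapist to review. All names invented.

from dataclasses import dataclass, field
from datetime import datetime

ENVIRONMENTS = ["room", "beach", "forest", "castle"]

@dataclass
class SessionLog:
    """Choices and actions a therapist could review after the session."""
    events: list = field(default_factory=list)

    def record(self, kind: str, detail: str) -> None:
        self.events.append(
            (datetime.now().isoformat(timespec="seconds"), kind, detail)
        )

class IVRSessionSketch:
    def __init__(self, log: SessionLog):
        self.log = log
        self.environment = None

    def choose_environment(self, choice: str) -> None:
        if choice not in ENVIRONMENTS:
            raise ValueError(f"unknown environment: {choice}")
        self.environment = choice
        self.log.record("environment", choice)

    def comfort_crying_child(self, words: str, action: str) -> None:
        # The words and gestures the participant chooses are the data points
        # a therapist might otherwise struggle to elicit in conversation.
        self.log.record("words", words)
        self.log.record("action", action)

# Example run
log = SessionLog()
session = IVRSessionSketch(log)
session.choose_environment("beach")
session.comfort_crying_child("It's okay, I'm here with you.", "sit beside her")
for timestamp, kind, detail in log.events:
    print(timestamp, kind, detail)
```

The point of the sketch is simply that each choice becomes an observable data point, which is the kind of information Liang describes therapists being unable to extract otherwise.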
Writing in the adjunct proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality, Liang and his colleagues report that preliminary feedback from the seven clinical psychologists they interviewed was overwhelmingly positive. Some praised the patient-centered nature of the design and said the choices made in the virtual environment could increase their knowledge about clients. Others said they wanted proper training before incorporating the system into their daily practice and asked for more controls, such as “increasing or decreasing individuation for each patient depending on their individual situation,” Liang noted. “Our avatars look very realistic now, and our next step is to make them intelligent and responsive to the needs of each person,” he added.
“Our work is also supported by two review papers that we recently published, one in JMIR Mental Health and the other in Games for Health Journal, and several other projects examining the use of VR to help people overcome mental health issues,” Liang said.
“We are currently working on the next prototype, which includes intelligent avatars with facial animations and the ability to have human-like conversations with the participants,” he said. “We believe our work can pave the way for large-scale efficacy testing, clinical use, and potentially cost-effective delivery of VR technology for mental health therapy in the near future.”
The rise of chatbot therapy
For years, artificial intelligence has been heralded as a potentially game-changing technology in the mental health domain, and COVID-19 accelerated its adoption. During the pandemic, millions of people around the globe suffered from depression, and many in isolation or quarantine turned to AI-powered chatbots for mental health support.
Riding this wave of interest was a string of mental health companies such as Woebot. Founded in 2017 by a team of Stanford psychologists and AI experts, the company offers an app built around a fully automated conversational agent that can monitor users’ mood, talk to them about mental health, and provide useful tools based on their needs.
The rise of digital therapists like Woebot is one of the topics explored in Counseling and Psychotherapy: Theory and Beyond. Featuring contributions from 37 experts of diverse cultural backgrounds, the book, edited by Dr. Russell Fulmer, a senior associate professor in the Academy of Future Education at XJTLU, is a “fresh and contemporary look at what counselors and psychotherapists use to help people,” he said.
In the chapter dedicated to the application of AI chatbots in psychotherapy, Fulmer argues that although therapeutic chatbots have advantages (“providing mental health support 24/7, and some individuals don’t feel judged when they converse with these bots”), they are not a substitute for long-term human-to-human therapy.
“They seem to excel in things like cognitive behavioral therapy (CBT) and are good at delivering psychoeducation,” Fulmer said. “On the other hand, we know from outcomes research that the therapeutic relationship accounts for a wide portion of the variance responsible for positive change. You can make a case that these bots don’t provide that therapeutic relationship because, after all, they are not human. Most are just words on a screen right now.”
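As a deliberately simplified illustration of what “delivering psychoeducation” can look like in code, the Python sketch below matches a message against a few cognitive-distortion keywords and replies with a reframing prompt. It is not Woebot’s implementation, and the labels, phrases, and prompts are invented for this example; real systems rely on far more sophisticated language understanding and clinically vetted content.

```python
# Toy, rule-based sketch of a CBT-style psychoeducation turn.
# NOT Woebot's implementation; distortion labels and prompts are invented.

DISTORTION_PATTERNS = {
    "catastrophizing": ["everything is ruined", "it's a disaster", "worst"],
    "all-or-nothing": ["always", "never", "completely failed"],
    "mind reading": ["they think i", "everyone hates me"],
}

REFRAMING_PROMPTS = {
    "catastrophizing": "What is the most likely outcome, rather than the worst one?",
    "all-or-nothing": "Can you think of a time when that wasn't entirely true?",
    "mind reading": "What evidence do you have about what they are actually thinking?",
    None: "Thanks for sharing. What thought went through your mind just then?",
}

def detect_distortion(message: str):
    """Return the first matching distortion label, or None."""
    text = message.lower()
    for label, phrases in DISTORTION_PATTERNS.items():
        if any(phrase in text for phrase in phrases):
            return label
    return None

def chatbot_turn(message: str) -> str:
    """One psychoeducation turn: name the pattern, then prompt a reframe."""
    label = detect_distortion(message)
    if label is None:
        return REFRAMING_PROMPTS[None]
    return (f"That sounds like '{label}', a common thinking pattern. "
            f"{REFRAMING_PROMPTS[label]}")

print(chatbot_turn("I completely failed the exam, everything is ruined."))
```

Even this crude pattern-matching conveys why such tools scale well for round-the-clock support while falling short of the therapeutic relationship Fulmer describes.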
With growing attention on this field, Fulmer believes there is a lot of potential waiting to be unlocked at the intersection of AI and psychotherapy. For example, text-based chatbots can take on human form, a possibility that has already been explored through SimSensei, a virtual human interviewer developed by researchers at the University of Southern California.
AI can also be tailored to help particular populations and be more culturally responsive. “If you are a counselor from wherever and you are working overseas or in a different country with a different population, you really have to get to know something about them, the culture, the people,” he said. “One benefit of AI is that it’s customizable. You can personalize it. So I think there’s potential there.”
Historically, the introduction of new technologies in an established field has frequently been accompanied by new complaints and fears. And the application of AI in psychotherapy is no exception. Fulmer suspects that as the use of AI continues to expand in mental health services, there will be some heavy pushback against it, specifically in regard to data security and the lack of rigorous studies about human-to-AI relationships. But Fulmer remains cautiously optimistic.
“If we are being realistic, I don’t think it’s going to be stopped. You can deny that reality or you can accept that it’s here and help shape it. That’s where I’m at with psychological AI. I would add as well that it’s not like the human being is perfect, either,” he said. “There’s every indication that AI will continue to advance across the board, including in mental health. Let’s acknowledge that reality and help shape AI in an ethical manner.”