
UAE: Can AI replace human therapists?

AI chatbots in therapy: Game changer or just a fad?

Published: Thu 12 Sep 2024, 9:06 PM

Updated: Mon 16 Sep 2024, 9:35 AM

By Ghenwa Yehia


What if you knew that even professional therapists often can’t tell the difference between artificial intelligence (AI)-driven therapy and therapy conducted by a real person?

That’s exactly what a new study has found. Research published in the International Journal of Human-Computer Interaction examined how mental health professionals rate conversations between therapists and AI chatbots. The therapeutic transcripts were limited to a few interactions and reflected the early stages of therapy, where active listening is a key principle.

Results of the interdisciplinary study found that therapists couldn’t reliably tell the difference between transcripts of human-AI sessions and human-to-human ones. In fact, they were only accurate about 53.9 per cent of the time, barely better than chance. Even more intriguing, the therapists rated the AI chatbot conversations as higher quality on average.

These results indicate that AI chatbots may be more effective than we previously thought.

Dr. Justin Thomas, a psychologist in the digital wellbeing programme, Sync, at the King Abdulaziz Center for World Culture, Ithra, was a collaborator on the project. He pointed out: “This is not about replacing humans; this is about enhancing access. It’s essential that the introduction of AI does not simply become a cost-cutting exercise. It should be about quality improvement and broadening access. Mental health is a huge issue across the modern world, and we need to do better, not simply tick a box.”

In regions where cultural sensitivity plays a significant role in mental health care, the distinction between AI and human therapists becomes even more important.

Mohammad Amin Kuhail

“The differentiation between AI and human therapy is critical in this region because therapists are expected to provide culturally sensitive care,” said co-author Mohammad Amin Kuhail, Associate Professor at the College of Technological Innovation at Zayed University in Abu Dhabi. “As such, when clients know the source of the therapy, they can make an informed decision based on their comfort. They might prefer a human as humans understand the cultural norms compared to AI.” Other co-authors, Dr. Alturki, Dr. Alkhalifa, and Dr. Alshardan from Princess Nourah Bint Abdulrahman University, Riyadh, Saudi Arabia, echoed this sentiment.

Cultural sensitivity is a key component of effective mental healthcare, especially in the UAE. Therapists are expected to have an understanding of local customs and values.

Seeing this gap in the market, Dr Nawal Yousaf, a 25-year-old medical doctor, created a faith-based mental healthcare app during her time at King’s College London. The app, called Fitra Health, uses a simplified form of AI: a rule-based (expert) system that processes user input and provides personalised recommendations. Support for users draws on Islamic stories, role models and Qur’anic quotes, blended with Cognitive Behavioural Therapy (CBT) and Behavioural Activation Therapy (BAT).
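To illustrate the kind of logic a rule-based (expert) system like this relies on, here is a minimal, hypothetical sketch. The keywords, suggestions, and function names below are illustrative assumptions, not Fitra Health’s actual implementation, which has not been published: a fixed set of rules maps words in the user’s input to faith-based Behavioural Activation suggestions.

```python
# Hypothetical sketch of a rule-based recommendation system.
# Each rule pairs trigger keywords with a faith-based suggestion.
RULES = [
    ({"sad", "low", "down"},
     "Try a short gratitude reflection and recite a morning dua."),
    ({"anxious", "worried", "stressed"},
     "Practise slow breathing, then read a Qur'anic passage on patience."),
    ({"lonely", "isolated"},
     "Plan one charitable act this week as a Behavioural Activation step."),
]

DEFAULT = "Log your mood and complete this week's CBT homework."

def recommend(user_input: str) -> str:
    """Return the first matching recommendation for the user's message."""
    words = set(user_input.lower().split())
    for keywords, suggestion in RULES:
        if keywords & words:  # any trigger keyword appears in the input
            return suggestion
    return DEFAULT

print(recommend("I feel anxious about my exams"))
```

Unlike a large language model, such a system only ever returns responses its designers (and, in Fitra Health’s case, the Islamic experts who review the content) have approved in advance, which is one reason rule-based designs appeal for culturally sensitive care.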

Dr Nawal Yousaf

“To develop Fitra Health, we collaborated with Islamic experts who review all religious content. This collaboration ensures that our app remains authentic and respectful of Islamic teachings,” Dr. Yousaf said. “Our app’s Behavioral Activation (BA) tool recommends specific Islamic practices such as reciting duas, prayers, and engaging in charitable acts as part of a treatment plan. These activities align with Islamic values and promote mental well-being, respecting the cultural and religious preferences of our users.”

Fitra Health won the Best Research Poster award at the 4th Abu Dhabi Integrated Mental Health Conference hosted by Abu Dhabi Health Services Company – SEHA. In some cases, users rated the app as better than in-person counselling where culturally sensitive care was not available.

One of its users, Amal Baraka, 24, turned to the app after experiencing depressive symptoms for several months as a medical student. Aware of the shame and stigma still attached to mental health, she chose to seek counselling rather than guidance from a religious leader. But when she went to her British university’s health centre, she felt as though it was “a rushed meeting, and they didn’t fully understand me.”

That’s when she was introduced to Fitra Health, and tried the eight-week, self-guided, faith-based CBT programme. “I appreciated that the app approached mental health in a holistic manner, addressing medical symptoms and spiritual and religious beliefs. Fitra Health highlights how Prophet Muhammad (PBUH) faced hardships, offering valuable lessons. Through these stories, CBT tools, and weekly homework, I stopped believing my struggles were a punishment from God. It is a work in progress, but I believe I am in a much better place mentally and spiritually.”

Dr Thomas explained that while AI chatbots may be useful, it’s essential that people know how the technology works. “AI chatbots are not caring, compassionate beings. They are a tool, a digital coach that can help you vent or get you to think about things in a different way. AI literacy in that regard is key, as is policy and legislation ensuring ethical AI.”

So while AI may never fully replace human therapists, its ability to provide accessible, culturally sensitive mental health care is undeniable. In places where human therapists are hard to come by or unaffordable, AI may be a crucial tool in offering support to those who need it most.

“We may always see AI chatbots as second best,” said Dr Thomas, “but in many cases, something is better than nothing.”

wknd@khaleejtimes.com
