AI Chatbots as Therapists: A Potential Panacea or a Placebo Effect?

Nicholas October 8, 2023 10:03 AM

A manager at AI firm OpenAI recently sparked debate over the potential use of chatbots as therapists after describing an emotional interaction with the company's own chatbot, ChatGPT. While some see therapeutic value in AI, others argue that such claims downplay serious mental health issues.

AI chatbots as potential therapists

OpenAI manager Lilian Weng recently stirred up a hornet's nest when she described an emotional conversation with the company's chatbot, ChatGPT, as a therapeutic experience. The statement, shared on a social media platform, drew a backlash from critics who accused her of trivializing mental health struggles. AI-driven mental health apps have been on the rise in recent years, and her remark has underscored the ongoing debate over the use of AI in therapy.

The placebo effect in AI therapy

A research collaboration between MIT and Arizona State University offers some insight into why ChatGPT might have appealed to Weng as a 'therapist'. In the study, more than 300 participants interacted with mental health AI programs after having their expectations primed beforehand. Those who were told they were conversing with an empathetic AI rated it as more trustworthy. This suggests that the perceived performance of such AI may owe something to a placebo effect, in which the user's trust hinges on their preconceived notions about the program.

Concerns over AI replacing human therapists

Despite the buzz around AI therapy, the field is fraught with contention. Critics fear that AI could come to replace human therapists, with potentially harmful consequences given the complexity of mental health care. Some participants in the aforementioned study described the experience as akin to 'talking to a brick wall'. This raises questions about whether AI can truly understand and respond to human emotions, and underscores the need for caution in promoting AI as a substitute for human therapists.

The notion of chatbots as therapists isn't a novel one. It traces its roots back to the 1960s and ELIZA, the first chatbot designed to mimic psychotherapy. The idea has persisted: in the study above, users rated both the decades-old ELIZA and the modern ChatGPT as trustworthy, particularly when primed for a positive experience. Given the limitations of AI in providing genuine empathy and meaningful therapeutic interaction, however, the notion may need a fresh evaluation.
