
Should you trust an AI chatbot with your mental health?

Millions are using Headspace, Wysa, Youper and other popular therapy chatbots, but can AI replace a professional counsellor?

Last updated: 27 February 2025
Fact-checked

Checked for accuracy by our qualified verifiers and subject experts. Find out more about fact-checking at CHOICE.

Need to know

  • AI chatbots for mental health and therapy are growing in popularity
  • Experts are raising concerns about both the efficacy of the tools and the safety of sensitive user data
  • Academics say apps making misleading claims could breach consumer law

This article mentions suicide. If you or anyone you know needs support, contact Lifeline on 13 11 14 or at lifeline.org.au, or Beyond Blue on 1300 224 636 or at beyondblue.org.au/forums.

Early in 2024 Sarah* started using the generative artificial intelligence (AI) tool ChatGPT for small administrative tasks.

"I would just experiment with it, for the start of a business correspondence or instead of using Google for something I would use ChatGPT," she says. 

After a while she began using it to supplement her mental health therapy, treating the AI as a "sounding board" between her psychologist sessions.

"I would tell the AI my attachment styles and ask it how best to respond to certain situations. I would ask it to keep an eye out for any red flags in the early stages of dating," says Sarah. 

"You want to feel like you have been heard and articulating that to someone, or something else, can be really helpful, kind of like a diary." 

Sarah felt the AI chatbot supplemented her therapy sessions rather than replacing them, and says her therapist didn't discourage her from using it.

However, Sarah has since stopped seeing her therapist but is still using ChatGPT, and has also been experimenting with the popular new AI tool DeepSeek.

"I do have concerns. There is [the issue of] privacy, and then also I think 'am I living more of a bubble life in an echo chamber after COVID, is this just a further way of isolating myself?'" 

The rise of therapy chatbots 

While some people, like Sarah, use general AI chatbots such as ChatGPT, DeepSeek, Meta AI and Gemini for mental health-related assistance, a rising number of AI products are designed specifically to help with mental health.

The US for-profit sleep, meditation and mental health app Headspace (not to be confused with the Australian youth not-for-profit organisation of the same name) is among the top ten mental health websites visited in Australia. In October last year, it launched Ebb, "an empathetic AI companion integrated into the app to help people navigate life's ups and downs".

The Headspace app has reportedly been downloaded more than 80 million times across various app stores. 

Other major players in the AI therapy chatbot space include Wysa and Youper, both of which have over a million downloads on the Google Play store. Youper says it has helped over 2.5 million users.


Not a replacement for therapy

While mental health therapy apps don't specifically claim to replace in-person psychologists, they do make varying claims about their ability to treat mental health conditions. 

Headspace's Ebb

Headspace's page promoting Ebb says it was developed by clinical psychologists using science-backed methods.

"Ebb is an AI-powered tool that's designed to help you better understand yourself. While therapy and coaching provide deeper emotional support, Ebb can help you maintain mental wellness by encouraging regular reflection and mindfulness," the website says. 

Further down, the website says, "Ebb is not a substitute for medical or mental health treatment. If you need support for a mental health condition, please talk with a licensed provider".

Headspace tells CHOICE that Ebb is not a "therapy chatbot" but rather "a sub-clinical support tool for our members to process thoughts and emotions or reflect on gratitude". 

"We acknowledge that Ebb, like any AI tool, has limitations. It's not intended to replace professional mental health care but to complement it by helping users manage stress, practice meditation, and engage in self-reflection," a spokesperson says. 

The spokesperson added that the chatbot has ways of detecting high-risk situations, such as suicidal thoughts or self-harm ideation, and will suggest that users contact emergency services.

Youper

The Youper website makes the point that the supply of mental health clinicians is not keeping up with demand for mental health services.

"The groundwork of Youper is evidence-based interventions – treatments that have been studied extensively and proven successful. Youper has been proven clinically effective at reducing symptoms of anxiety and depression by researchers at Stanford University," the site claims,  citing  83% of users experience "better moods".

Youper did not respond to our questions about the scientific underpinnings of these claims. 

Wysa

Wysa says its product is designed to complement traditional therapy and undergoes continuous compliance and safety testing to make sure users don't receive harmful or unverified advice. 

"While ChatGPT and Gemini are powerful general AI models, Wysa is purpose-built for mental health with strict clinical safety guardrails," the chief of clinical services and operations at Wysa, Smriti Joshi, tells CHOICE. 

The unknown potential harms

Professor Jeannie Paterson, director of the Centre for AI and Digital Ethics at the University of Melbourne, says there is a lot we don't know about how AI therapy chatbots function.

Paterson says there is a big difference between using the technology in collaboration with a professional and going to an app store to buy something we "know very little about". 

"They could be causing you harm, or you're paying for something that does you very little good, because it's just not tested." 

Previous media reporting around the world has highlighted the potential for harm when therapy chatbot algorithms go wrong.

In 2018, the BBC reported that the therapy apps Wysa and Woebot, which were being promoted to children in the UK, failed to advise a journalist posing as a child experiencing sexual abuse to contact emergency services and get help. Both apps have had significant updates since then.

According to National Public Radio in the US, the US-based National Eating Disorders Association fired its phone-line staff in 2023 in an apparent cost-saving measure and began promoting an AI chatbot instead, which went on to offer dieting advice to people with eating disorders.


Experts say there's a big difference between using a therapy chatbot in collaboration with a professional, and buying something from an app store we know little about.

Meeting a demand 

Piers Gooding, an associate professor at La Trobe University's law school, has researched mental health chatbots extensively.

He says he expects usage to continue to grow, simply because so much money is being poured into their development worldwide. Australia's healthcare system is more effective than that of countries like the United States, but there are still sizeable gaps between demand for mental health services and the available supply.

"In 2021, according to one market report, digital startups focusing on mental health secured more than five billion dollars in venture capital – more than double that for any other medical issue, and investment further increased in 2023," he says. 

"I suspect there will be some kind of reckoning with some of the over-claiming about what they can do, and that might come in the form of people just voting with their feet and realising that it's not quite what it's cracked up to be." 

Australian Association of Psychologists Inc policy coordinator Carly Dober says she understands that in a cost-of-living crisis, many people can't afford the mental health care and support they need, and chatbots appear to be a more affordable option.

"I don't think it's the fault of people for trying to find whatever they can to support themselves in the moment. But unfortunately, when there is that kind of vacuum, sometimes not the most helpful players will try to fill that market or that space," she says.

Lack of regulation? 

Dober says there is a big difference between using a chatbot in conjunction with therapy and replacing therapy with AI, which she says lacks the "checks and balances" needed.

"There is no uniform law around AI chatbots, there is a lack of regulation of that space, whereas we as psychologists are a highly regulated field," she adds. 

TGA exemptions

La Trobe University's Gooding says the Therapeutic Goods Administration (TGA) appears to exempt AI therapy chatbots from being regulated as a "medical device" if the providers take steps such as:

  • working from widely accepted cognitive behavioural therapy (CBT) models
  • not providing experimental therapy 
  • not diagnosing mental health conditions. 

The TGA says it is currently "reviewing the ongoing appropriateness" of these exemptions and that it will consult stakeholders before suggesting any changes to government. It adds that to date it has received no complaints about therapy chatbots.

'Innovation important'

Gooding says that despite the lack of TGA regulation, under consumer law the claims app providers make about the benefits of their products must not be misleading or deceptive.

He adds that innovation in the technology space is important and that a "heavy handed" regulatory approach from the TGA might not best serve consumers. 

"Regulatory change is almost certainly needed to remove the digital mental health app exemption in relation to chatbots. Chatbots can't capture nuance, and they can be easily programmed to have addictive elements and pretend to be real people. That poses a real danger in the mental health context," Gooding says.

*Not her real name
