Using ChatGPT As Your Therapist
Working in the therapeutic space is tricky at the best of times, but never more so than now, with the rise of ChatGPT and other AIs. More and more I hear of and see people using these platforms as their therapists, chatting with them at any time of the day or night, divulging their deepest worries. There is very real concern within the industry that counsellors and psychotherapists will be replaced altogether. But I think it’s more complicated than that. Although I can totally see the allure and potential benefits of generative AI chatbots in a therapeutic space, there is more to the story.
A 2024 study by Paolo Raile found that ChatGPT offers an interesting complement to psychotherapy and an easily accessible, good (and currently free) place to go for people with mental health problems who have not yet sought professional help and have no psychotherapeutic experience. However, the study also makes clear that its suggestions are not only insufficient as a substitute for psychotherapy, but also carry a bias that favours certain methods while not even mentioning other approaches that may be more helpful for some people.
Let’s look at the pros and cons of using ChatGPT and other AI bots as a therapist:
The Pros
Instant access and availability, no wait times, no appointments
Free (for now)
No human to talk to means no external influence or judgement
Can provide good ways to clarify thoughts, reframe ideas and explain concepts in ways you understand
The Cons
There is no human connection. I cannot tell you how many times in my practice clients have simply wanted someone to talk to, to sit with, to be with them. Co-regulation (sitting with a calm human) has proven physiological effects on your nervous system.
AI does not have your best interests at heart the way a therapist is trained to. Its makers prioritise harvesting data over helping you.
There are huge ethical concerns, as this data is going somewhere and is not being held in confidence. Who holds your deepest thoughts? What are they using them for?
They can’t notice or respond to non-verbal cues, such as when you’re avoiding eye contact, hesitating, or fidgeting. These are all extraordinarily helpful markers in therapy, letting us know when to speed up, slow down, or change something.
In addition, there is a lot of power in sitting with silence in therapy spaces as a way to process emotions. Obviously, ChatGPT can’t do that.
There is no accountability for your healing and no way to challenge you; there are lots of reports of ChatGPT displaying incredibly sycophantic, intensely complimentary behaviour, which, although it might be nice to read, won’t help you grow.
There is no capacity for real change; these AIs can provide great symptom-management strategies and challenge certain thoughts, but therapy is about doing the actual work and going deep within yourself, figuring out where these feelings come from rather than just being offered solutions (Dazed, 2025).
This won’t be relevant to everyone, but an AI can’t refer you within a network of other professionals if needed, say, to aid in a diagnosis or to access medication.
With all AI, there are huge environmental concerns. AI servers require huge amounts of water and produce mountains of electronic waste.
It is also worth adding in something here – that although these AIs might sound human, they are not; “they use pattern-matching and data scraping, producing human-like speech that is believable enough to convince some people that it can act as a form of mental health support” (Vice, 2025). For some people, that might be exactly what they want and need from a therapeutic relationship, and for those people, I am so happy they have a place to go. But for everyone else, it’s important to note that these bots can’t understand your emotional state, even if you prompt them. They are NOT human.
For me, I can really see the ease and usefulness of AI platforms to help with reframing, organising thoughts or top-line coping strategies. It stands to reason that it can be used as a wonderful addition to therapy, like journaling on steroids, and with the cost of in-person therapy being what it is, it’s a great way to work through life stuff when it suits you.
But it is important to remember that with therapy, the relationship is the healer. All the stuff you want to talk to ChatGPT about is great, but it is not therapy. If you want to create lasting change in your life and work through mental health conditions, AI is a helper, not a healer.
I want to end with this quote from First Session, which I think really encapsulates the dilemma:
“Therapy is not just about logic. It’s about relationship, transformation, and human connection.
AI therapists and therapy will get better. It will feel more human-like, more personalized, more responsive.
But it will never be truly human.
And when it comes to therapy, that distinction makes all the difference.”