“Sometimes I talk to the dog. Sometimes I talk to ChatGPT. Only one makes me feel like it is talking back to me.” So said my 22-year-old son.
His words made me think about how we are entering a world where artificial intelligence (AI) is becoming a common source of psychological support, even therapy.
For many people, that’s fine. For those feeling anxious, flat, or a bit lonely, AI-based tools can offer something genuinely helpful.
They reflect your thoughts, give you language, offer calm. In that way, they mirror your feelings. And sometimes, that’s enough.
But that is precisely where things get complicated.
When we cross from expressions of normal human emotion into serious mental illness (severe depression, psychosis, anorexia nervosa), the mirroring that makes AI feel so responsive can become a risk.
If someone believes they are being persecuted, or is overwhelmed by self-loathing, a language model designed to sound empathic through mirroring may unwittingly reinforce those very beliefs.
The problem is not just the mirroring. It is also that AI models are trained on publicly available text, while the therapeutic interactions that matter most in a crisis are not public at all. They live in confidential notes, private sessions, closed clinics.
That brings me to something important.
In 2025, the New England Journal of Medicine, one of the most prestigious medical journals, published a paper showing that AI-based therapy can be effective, particularly for anxiety and depression.
However, the method in that paper was different: an AI model trained not only on publicly available information but also on real clinical notes, within strict ethical frameworks. It did not just guess how to help; it learned from the clinicians who do that work every day.
The risk, of course, is that if this nuance is lost, headlines ‘endorsing AI therapy’ will be misread. Not all suffering is the same, and not all AI therapy is created equal.
As psychiatrists, we must first make clear the distinction between the extremes of normal human emotion and serious mental illness. For those living with serious mental illness, what truly heals is not something that merely looks empathic, but something that delivers on that empathy.
Second, there is an urgent imperative to shape how these new AI tools are developed, regulated, understood, and used: with clarity, safety, and compassion.