
Millions of people are turning to AI for emotional support. The hope is clear: therapy that is affordable, immediate, and available at any time. For many, it feels like relief — or at least like a companion. But the question is not technological. It is deeply human: Can pain truly be understood by a machine?
This is the question The Economist recently examined in "Machines of Loving Grace – Electronic Therapy: Millions Are Turning to AI for Mental Health. Is It Ready for Them?" (print edition, November 15th, 2025, Science & Technology section, pp. 72–73), a look at the popular AI therapy apps and what they offer:
What AI Therapy Promises:
✔ 24/7 access — no waiting lists
✔ Affordable — often free
✔ No judgment or social shame
✔ Emotional naming & psychoeducation
✔ Structured CBT exercises
✔ Privacy (though data protection remains questionable) — therapy from bed, train, or bathroom.
AI gives structure to an overwhelmed mind — but structure is not the same as healing.
As someone with well-developed self-reflection, I sometimes use AI for emotional reflection. And I often find myself contradicting it: AI tends to rush emotional processes.
When I speak about anger or injustice, it often attempts to reduce pain too quickly — with responses such as:
"Try to understand them," "look at the positive side," "maybe let it go."
In those moments, AI feels parental… almost sycophantic, trying to calm me rather than stay with the complexity of the emotion.
But psychological truth is different:
Forgiveness cannot be forced, especially when trauma or betrayal is present.
Pain needs to be witnessed before it can transform. AI offers relief — but it cannot offer attunement.
AI Limitations and Their Psychological Impacts:
AI may speak empathetically — but empathy without presence risks becoming imitation.
The concern about "emotional bypassing" deserves deeper examination. Is the issue AI itself, or are we deploying tools designed for structured interventions (like CBT) in contexts that require relational depth? Different therapeutic modalities have different aims. CBT excels at symptom management and cognitive restructuring — tasks AI can support. But depth psychotherapy, trauma work, and attachment repair demand something AI cannot provide: a living relationship where the therapeutic process unfolds through mutual regulation and authentic presence.
We must ask not only if AI can help, but what kind of help we're offering — and to whom.
Will AI replace human therapists? Probably not. But it will change therapy.
AI may become part of how therapy is delivered, but it should remain a tool, not a replacement for the relational depth of therapeutic contact.
AI is not simply technology.
It is a symptom of our society — rapid, overwhelmed, craving help but afraid of exposure.
We ask AI to listen… because sometimes it feels safer than being seen.
And perhaps that safety reveals something profound: our deep ambivalence about human vulnerability itself. Maybe AI therapy isn't just filling a gap in access. Maybe it's exposing our desire for healing on our own terms, without the risk of being truly known—without the messiness of being witnessed in our raw, unedited pain.
That's not a failure of AI.
It's a mirror of modern loneliness.
AI may give guidelines — but only a human can give attunement. For further reflection: What are we afraid of when we choose the machine over the person? And what might we be losing in that choice — not just therapeutically, but as a culture learning to be with pain?
Olena Tertyshnyk, Psychotherapist
23 November 2025