AI Therapy: A Promise, a Risk - and a Mirror of Our Human Need

Millions of people are turning to AI for emotional support. The hope is clear: therapy that is affordable, immediate, and available at any time. For many it feels like relief — or at least, a companion. But the question is not technological. It is deeply human: Can pain truly be understood by a machine?

The Rapid Rise of AI Therapy

According to "Machines of Loving Grace – Electronic Therapy: Millions Are Turning to AI for Mental Health. Is It Ready for Them?" (The Economist, 15 November 2025, print edition, Science & Technology section, pp. 72–73):

  • 1.2 billion+ mental health app downloads in 2023 (Sensor Tower)
  • 25% of UK adults would try AI for therapy or already have (YouGov poll, Oct 2024)
  • Global shortage of over 1 million mental health professionals (WHO)
  • Some AI-based CBT tools show up to 19% reduction in depression symptoms after six weeks
  • AI tools are increasingly tested in national health systems — UK (NHS), Singapore, India

AI is not the future of therapy. It is already competing with it.

Popular AI Therapy Apps:

  • Wysa – AI chatbot using CBT & DBT, used by NHS and Singapore Ministry of Health
  • Woebot – CBT-based emotional check-ins; Stanford trial showed reduced anxiety in two weeks
  • Youper – Mood analysis and medication reminders; used in private US clinics
  • Touche Health – CBT support for anxiety and depression; used by India's National Institute of Mental Health
  • Mindsera – AI journaling and reflection tool focused on self-development
  • Replika – Emotional companion chatbot; not clinically validated and raises ethical concerns.

What AI Therapy Promises:

✔ 24/7 access — no waiting lists

✔ Affordable — often free

✔ No judgment or social shame

✔ Emotional naming & psychoeducation

✔ Structured CBT exercises

✔ Privacy (though data protection remains questionable)

✔ Therapy from bed, train, or bathroom

AI gives structure to an overwhelmed mind — but structure is not the same as healing.

My Experience: AI as Helper — and Sometimes Opponent

As someone with a well-developed capacity for self-reflection, I sometimes use AI for emotional reflection, and I often find myself contradicting it: AI tends to rush emotional processes.

When I speak about anger or injustice, it often attempts to reduce pain too quickly — with responses such as:

"Try to understand them," "look at the positive side," "maybe let it go."

In those moments, AI feels parental… almost sycophantic, trying to calm me rather than stay with the complexity of the emotion.

But psychological truth is different:

Forgiveness cannot be forced, especially when trauma or betrayal is present.

Pain needs to be witnessed before it can transform. AI offers relief — but it cannot offer attunement.

Psychological Limitations of AI Therapy

AI limitations and their psychological impacts:

  • When AI is too reassuring, it can block emotional processing
  • When it offers fast advice, it encourages emotional bypassing
  • Without regulation of body and mind, the user's nervous system remains dysregulated
  • Without transference, attachment wounds remain untouched
  • Without professional accountability, responsibility is unclear if harm occurs

AI may speak empathetically — but empathy without presence risks becoming imitation.

The Question of Emotional Bypassing

The concern about "emotional bypassing" deserves deeper examination. Is the issue AI itself, or are we deploying tools designed for structured interventions (like CBT) in contexts that require relational depth? Different therapeutic modalities have different aims. CBT excels at symptom management and cognitive restructuring — tasks AI can support. But depth psychotherapy, trauma work, and attachment repair demand something AI cannot provide: a living relationship where the therapeutic process unfolds through mutual regulation and authentic presence.

We must ask not only if AI can help, but what kind of help we're offering — and to whom.

So… Will AI Replace Therapists?

Probably not. But it will change therapy.

AI may become:

  • Pre-therapy support
  • Psychological first aid while waiting for care
  • Homework tool between real sessions
  • Temporary companion for isolated or disabled populations

But it should remain a tool, not a replacement for the relational depth of therapeutic contact.

Final Reflection

AI is not simply technology.

It is a symptom of our society — rapid, overwhelmed, craving help but afraid of exposure.

We ask AI to listen… because sometimes it feels safer than being seen.

And perhaps that safety reveals something profound: our deep ambivalence about human vulnerability itself. Maybe AI therapy isn't just filling a gap in access. Maybe it's exposing our desire for healing on our own terms, without the risk of being truly known—without the messiness of being witnessed in our raw, unedited pain.

That's not a failure of AI.

It's a mirror of modern loneliness.

AI may give guidelines — but only a human can give attunement. For further reflection: What are we afraid of when we choose the machine over the person? And what might we be losing in that choice — not just therapeutically, but as a culture learning to be with pain?

Olena Tertyshnyk, Psychotherapist
23 November 2025
