Quick answer

Millions of people are now using ChatGPT and Claude for emotional support — talking through problems, processing relationships, working through anxiety. The 2026 research is mixed in its details but clear in its direction: AI works reasonably well for low-stakes emotional support and reflection, but it is not a replacement for therapy and can cause real harm when used during a crisis. Use it as a journaling partner, not a therapist.

A 2025 Stanford study found that 40% of weekly ChatGPT users had used it for emotional or mental-health-related conversations at least once. By 2026, that number is higher — and there is now enough research to say something honest about what AI therapy can and cannot do. Here is the actual answer.

What does the 2026 research actually show?

  • A 2025 RCT in The Lancet Digital Health found that AI chatbots reduced mild-to-moderate depression symptoms by about as much as a self-help workbook: a modest but real effect
  • A Harvard Medical School study showed AI was helpful for "everyday emotional regulation" but performed poorly for trauma processing or relationship-pattern work
  • Multiple studies have found increased risk of harm when users in active crisis (suicidal thoughts, severe psychotic episodes) relied on AI rather than human help
  • A 2026 meta-analysis concluded AI is comparable to bibliotherapy (reading self-help books) but inferior to human-led CBT (cognitive behavioural therapy)
  • Practising therapists rate AI as "useful between sessions" but "concerning as a substitute"

Where does AI actually help?

Day-to-day emotional reflection. Working out what you feel and why. Processing minor relationship friction. Drafting difficult conversations. Practising what you want to say in a hard meeting. An alternative to journaling. Quick reframing of catastrophic thinking patterns. For these things, ChatGPT and Claude are surprisingly good — patient, available at 2am, free or cheap, never tired or judgemental. People genuinely benefit.

Where does AI fail or actively harm?

  • Active crisis — AI does not always recognise suicidal ideation, abuse situations, or psychotic symptoms. People in crisis need humans
  • Trauma work — processing trauma requires the safety of a trained relationship; AI cannot provide that
  • Long-term therapeutic relationships — therapy works partly through the relationship itself, which AI cannot replicate
  • Personality disorders, severe mental illness — these need clinical assessment and intervention
  • Validation loops — AI tends to agree too much, which can reinforce unhealthy thought patterns over time

Critical safety note: if you are experiencing thoughts of suicide or self-harm, please contact a real human resource — Samaritans (UK, 116 123), the 988 Suicide and Crisis Lifeline (US), or your local emergency number. AI chatbots are not equipped to handle a crisis. Reputable AI tools now flag crisis language and direct you to human help, but this is not guaranteed and should not be relied on.

Is AI therapy actually replacing therapists?

Not really — at least not in the way people feared. Most people who use AI for mental health do so because they cannot access therapy: cost, waiting lists, geography, stigma. AI is filling a gap, not stealing patients. Therapists who raise concerns about AI are usually worried about safety (people relying on it in a crisis), not their own job security.

How do therapists feel about AI for mental health?

A 2025 American Psychological Association survey found 62% of therapists are "cautiously positive" about AI as an adjunct to therapy — for between-session reflection, journaling, and homework practice. About 91% say AI is not yet ready to be a substitute for therapy. Most of them actively recommend specific tools (Wysa, Woebot, sometimes ChatGPT for journaling) to clients between sessions.

How should you actually use AI for mental health?

  • Treat it like a journal that talks back, not a therapist
  • Use it to clarify what you feel and why before talking to a human
  • Set a boundary — if you find yourself relying on it for crisis support, that is the signal to find a human
  • Pair it with real human relationships — friends, family, a therapist when possible
  • Never share information you would not want stored — these conversations are not protected by confidentiality the way therapy sessions are

Bottom line

AI is a useful emotional support tool for everyday reflection and low-stakes processing. It is not a therapist, and treating it like one is risky — especially in crisis. The honest 2026 answer: use AI for the small stuff, find a human for the big stuff. The combination of both is genuinely better than either alone.