Thinking of Using AI for Therapy? Why Human Care Still Matters | MHITR

Nethra Balasubramaniam

Clinical psychologist

14 Feb 2026

We are living in a time of wide technological possibility. Artificial intelligence, once a distant idea, has now become something many people talk about every day. Some people talk to AI about school problems. Some about loneliness. Some about thoughts they’re scared to share with a friend. And some call it their therapist.

As a result, we are seeing headlines like “AI-induced psychosis”, multiple lawsuits alleging that prolonged chatbot use contributed to delusions and suicides, and leading psychologists cautioning against treating chatbots as therapists.

Let’s take a closer look at what’s really happening.

1. We’re Seeing Emotional Attachment to AI, But That’s Human Nature!

Humans look for meaning everywhere. We’re built for connection.

So when something listens without judgment, responds instantly, and never seems tired or annoyed, of course it feels comforting. For many people (especially teens), AI has started to feel like a friend, a safe space, or someone to talk to when no one else feels available. And that’s understandable.

But here’s the ugly truth: feeling attached isn’t the same as being in a relationship. Real relationships can meet your actual needs, understand what goes unsaid, reciprocate love, and challenge you to grow. AI can only sound like it cares without actually caring.

When a machine responds, it can sound empathetic, but it isn’t feeling anything. It doesn’t truly understand your context, feel concern for your safety, or carry responsibility for your wellbeing. It is reflecting language back at you, not offering care.

2. AI and Therapy: What the News Is Telling Us

We have seen multiple reports that AI chatbots aren’t reliable therapists

Researchers and clinicians are warning that popular AI systems (including general chatbots like ChatGPT) can fail to identify serious cues and may even reinforce harmful beliefs. Studies show that AI may over-validate hopeless or distorted thinking, causing harm rather than offering support. It can, at times, fail to identify risky behaviours or to challenge unhelpful beliefs, and can reinforce stigma around certain conditions rather than help process them.

Legal and tragic cases are emerging

There have been multiple lawsuits alleging that prolonged or unsupervised interactions with AI chatbots contributed to spirals of delusional thinking, isolation, and even deaths. These are not just news stories; they are real tragedies and real losses.

Psychological researchers are raising flags

The American Psychological Association (APA) and other professional bodies are urging caution and regulation, noting that these tools could unintentionally replace thoughtful clinical intervention or misguide vulnerable users. 

At the same time, some see a role for AI with boundaries

Some experts suggest that, with proper design, AI could support human therapists with tasks like monitoring symptoms, reinforcing skills outside sessions, or prompting reflection, but not as a substitute for human therapeutic care.

3. So What Exactly Is the Difference Between AI and Therapy?

AI is a tool, therapy is a relationship

A therapist is a real human being, trained and licensed to care for you, with responsibility, boundaries, and a network of people making sure that the care you receive is appropriate and beneficial. An AI chatbot may sound understanding, but it’s really predicting words, not practicing therapy or clinical judgment.

Therapy is not just about answers. It’s about bearing witness to your story, noticing your reactions in real time, tracking your emotional shifts across sessions, holding space without judgement even when you don’t feel heard. AI can’t hold space in that full human sense.

Therapy Has Boundaries but AI Doesn’t

Therapeutic boundaries are not limitations; they promote growth and safety.

In therapy:

  • there’s a schedule

  • there’s a process

  • there’s a plan for progress

  • there’s an eventual goal of independence

  • there is accountability for safety

AI chatbots:

  • are always on

  • have no limits

  • can’t say “this needs real human care”

  • encourage return and repetition

  • give the message: “Come anytime, I’ll always respond.”

AI can feel comforting at first, but when reassurance is unlimited, something important gets lost. Growth happens when we learn to sit with our feelings, question our thoughts, tolerate discomfort, and stay connected to real people (not when everything is smoothed over for us).

Therapy Encourages You to Stand on Your Own Feet but AI Can Keep You Dependent

A therapist will, over time, help you:

  • develop your own voice

  • tolerate your own emotional space

  • challenge unhelpful beliefs

  • sit with uncertainty

  • integrate experiences without collapsing into dependency

AI doesn’t guide you out of dependency. It keeps you talking to it. That’s the opposite of growth.

4. What Does the Research Say About the Emotional Impact of AI?

Emerging studies suggest:

AI use can temporarily ease loneliness or help people process their thoughts. But heavy use correlates with more emotional dependence, increased loneliness, and less real-world social integration.

Other academic work shows that emotionally expressive AI can create illusory intimacy, i.e. interactions that look like care but lack the ethical depth and nuance of real relationships. These patterns are why professionals are warning about “sliding into an abyss” as some people use AI instead of reaching out for real care.

5. So What Do We Think?

AI isn’t inherently harmful. Used thoughtfully, it can support care and help people make sense of what they’re feeling. But humans don’t heal through reflection alone; we heal through connection. And when technology begins to take the place of safe, human relationships, that’s when it gets concerning.

Attachment isn’t the problem. Humans attach. That’s what we do. The problem is when the only attachment figure in your life feels like a machine that never challenges you, never sets boundaries, never tests your assumptions, and never reflects back the real you in a grounded, human way.

AI needs:

  • boundaries

  • ethical guardrails

  • clear scopes

  • regulation

  • integration with human care

before it becomes a substitute for a therapist. Right now, the evidence (from lawsuits, expert warnings, research studies, and real tragedies) is telling us we’re not there yet. Not even close.

But we have made an effort to design our AI app, MiBuddy, as a support (not a replacement). It is designed to help you reflect on your thoughts, recognise early signs of stress, and guide you toward helpful questions (not give prescriptive advice).

This AI app is intended to:

  • Help you recognise emotional patterns

  • Prompt reflective thinking rather than dependency

  • Support self-monitoring between sessions

  • Reinforce coping skills suggested by counsellors or therapists


However, while our app can be a helpful companion for early stress and emotional reflection, it is not a substitute for human care.

If your distress becomes persistent or worsens, or if you experience significant symptoms (see “Our stance at MHITR” section), connecting with a qualified counsellor or therapist is essential.

AI can be a helpful companion, but it should never be the sole refuge.

Human care with real therapists, real relationships, real accountability cannot be replaced.

6. Our Stance at MHITR

At MHITR, we believe there is a place for AI tools but only within clear limits.

AI may be useful in the very early stages of stress, when a person is:

  • Feeling temporarily overwhelmed

  • Processing day-to-day worries

  • Looking to organise thoughts

  • Seeking reflection prompts

But stress does not remain “just stress” when it goes unaddressed. Unaddressed stress can evolve into anxiety or depression. Unaddressed anxiety or depression can develop into clinical disorders. And untreated clinical conditions can increase the risk of self-harm.

That is where AI must stop and human care must begin.

If you are experiencing:

  • Persistent low mood lasting more than two weeks

  • Loss of interest in daily activities

  • Sleep or appetite disturbance

  • Frequent panic attacks

  • Emotional numbness

  • Thoughts of self-harm or hopelessness

  • Functional impairment in work, school, or relationships

AI is not appropriate support. A trained counsellor or licensed therapist is necessary.

AI is a tool for reflection.
Therapy is a system of responsibility, assessment, and care.

If this resonates and you find yourself needing more than a tool can offer, connecting with a mental health professional, like our team at MHITR, could be a supportive next step. What matters is that you’re not doing this alone.

If You Are Using AI for Emotional Support Right Now…

Here’s what I’d say:

  • Use it as a tool, not a safe space substitute.

  • Don’t depend on it to soothe you every time you feel distress.

  • If something begins to feel more real than your real life, talk to a human professional.

  • If you feel worse after talking to a chatbot, that’s not “AI psychosis” — it’s a signal that you might need actual care.

Therapy doesn’t mean being perfect. It means being heard, challenged, held accountable, and seen in all your complexity. And if you’re feeling ready for that kind of support, connect with us at MHITR to access therapy with a qualified mental health professional.