Written By
Protik Roychowdhury

With AI mental health, how do we decide what stays human?



There’s no shortage of excitement around AI mental health tools. They’re fast. Scalable. Available 24/7. They can ask questions, mirror emotions, even recommend next steps. And in many cases, especially low-acuity, high-frequency ones, they’re good. Sometimes, an AI therapist seems even better than a human.

But ask anyone who’s benefited from a good therapist, and they’ll tell you: human presence makes a difference. Sometimes it’s subtle, non-verbal signals—like a knowing nod—that invite deeper sharing. Other times, it’s the solace found in a moment of silence as a therapist simply witnesses a client’s pain. 

Healing isn’t just about finding answers. It’s about our shared humanity, something technology can’t replace. As we scale AI mental health tools, the real question is this: What should humans still own, and where should AI take the lead?

What AI can do and what it can’t

A 2024 study found that AI-driven chatbots reduced depressive symptoms in over 70% of low-risk users, rivaling human coaches. In some areas, like coaching, check-ins, or early-stage student support, a human may not need to be in the loop at all. Automation allows these services to scale and reach more people than ever before.

However, automation often falls short in other domains. In high-stakes situations requiring trauma-informed care or acute risk intervention, AI may never match the nuance and expertise of licensed professionals. In some cases, its echo chamber effect can even make things worse.

Here’s what we think AI does better, and what humans still do best: 

What AI does better

  • Triage and risk escalation: AI doesn’t get tired. It never forgets to ask. It isn’t swayed by bias. Early trials suggest AI can flag risk faster than human intake teams, potentially catching warning signs before they escalate (a minimal sketch follows this list).

  • Data-driven matching and insights: AI can learn from thousands of sessions, surfacing care patterns and provider matches that might go unnoticed by humans. For example, it can match patients with therapists based on personality traits and past treatment outcomes.

  • Admin, summaries, and prep: Clinicians shouldn’t spend half their time on notes, scheduling, or follow-ups. AI can shoulder the administrative load, freeing up time and energy for what matters most: the patient.
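
To make the triage point concrete, here’s a minimal rule-based sketch in Python. The phrase lists and decision labels are illustrative placeholders, not a validated clinical screening protocol; a real system would use validated instruments and learned models, with every high-risk flag routed to a human.

```python
# Illustrative triage sketch. Phrase lists and labels are hypothetical
# placeholders, not a clinical screening protocol.
HIGH_RISK_PHRASES = {"hurt myself", "end it all", "no reason to live"}
MODERATE_RISK_PHRASES = {"hopeless", "can't cope", "worthless"}

def triage(message: str) -> str:
    """Return an escalation decision for a single incoming message."""
    text = message.lower()
    if any(phrase in text for phrase in HIGH_RISK_PHRASES):
        # AI never handles acute risk alone; hand off immediately.
        return "escalate_to_human_now"
    if any(phrase in text for phrase in MODERATE_RISK_PHRASES):
        return "flag_for_clinician_review"
    return "continue_ai_support"

print(triage("Lately I feel hopeless about everything"))
# -> flag_for_clinician_review
```

The value here isn’t sophistication; it’s consistency. This check runs identically on message ten thousand as on message one, which is exactly where a tired human intake team slips.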

What humans still do best

  • Ethical judgment: Humans understand when rules should be followed and when they should be bent. For example, a clinician may decide, after assessing that a client poses a risk of harm to others, to breach confidentiality and alert an emergency contact.

  • Cultural intuition: Humans can grasp context that AI can’t, even with data. When an Asian client says, “It’s just stress, I’m used to it,” a clinician attuned to cultural norms may hear what’s unsaid and gently uncover the hidden burden beneath.

  • Reframing and resonance: Humans can sit with ambiguity, contradiction, and even silence. When a client says, “I don’t know if I want to get better,” a clinician can hold space for that ambivalence without defaulting to problem-solving mode.

  • Relational depth: When words fail, human connection can be healing. Without offering advice of any sort, a clinician may sit quietly with a grieving parent, providing a steady presence that makes them feel less alone.

Based on the strengths of artificial and human intelligence, we need to clearly define where AI should lead, where it should support, and where it should step aside entirely. That might look like:

| Care mode | Best used when | Examples |
| --- | --- | --- |
| Fully AI-led | Low risk, low emotional depth. Task-focused and available 24/7. | Skill-building exercises, habit tracking, psychoeducation, automated nudges |
| AI-led with human oversight | Moderate emotional load. High-frequency support that may need escalation. | In-care check-ins, human-in-the-loop chats, treatment adherence, support groups |
| Human-AI collaboration | Precision needed. AI offers insights; a human makes care decisions. | Diagnostic assessments, treatment planning, medication reviews, care coordination |
| Human-led with AI support | High emotional risk or ambiguity. Trust and context matter most. | Complex grief, cultural navigation, trauma-informed therapy, family therapy |
| Human-led only | Acute crisis or ethical complexity. Requires full human presence. | Suicide prevention, abuse disclosures, crisis intervention, ethical dilemmas |
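
To show how such a tiering might be wired into a product, here’s a minimal routing sketch in Python. The signal names are hypothetical intake flags, not Intellect’s actual schema; what matters is the order of the checks, which works down from the highest-stakes tier so acute risk can never fall through to an AI-only path.

```python
from dataclasses import dataclass
from enum import Enum, auto

class CareMode(Enum):
    FULLY_AI_LED = auto()
    AI_LED_HUMAN_OVERSIGHT = auto()
    HUMAN_AI_COLLABORATION = auto()
    HUMAN_LED_AI_SUPPORT = auto()
    HUMAN_LED_ONLY = auto()

@dataclass
class IntakeSignals:
    # Hypothetical intake flags, for illustration only.
    acute_crisis: bool             # e.g. suicide risk, abuse disclosure
    trauma_or_complex_grief: bool  # high emotional risk or ambiguity
    needs_clinical_decision: bool  # diagnosis, treatment plan, medication
    moderate_emotional_load: bool  # recurring support that may escalate

def route_care_mode(s: IntakeSignals) -> CareMode:
    """Pick a care mode, checking the highest-stakes tier first."""
    if s.acute_crisis:
        return CareMode.HUMAN_LED_ONLY
    if s.trauma_or_complex_grief:
        return CareMode.HUMAN_LED_AI_SUPPORT
    if s.needs_clinical_decision:
        return CareMode.HUMAN_AI_COLLABORATION
    if s.moderate_emotional_load:
        return CareMode.AI_LED_HUMAN_OVERSIGHT
    return CareMode.FULLY_AI_LED

print(route_care_mode(IntakeSignals(False, False, True, True)))
# -> CareMode.HUMAN_AI_COLLABORATION
```

The design choice worth copying is the fall-through order: ambiguity resolves upward toward more human involvement, never downward toward less.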

With Intellect Atlas, we’re using AI to:

  • Recommend care types and surface needs before a session
  • Draft case summaries to free up therapist time (a sketch follows this list)
  • Support users who may never want or need a human provider
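
As a hedged sketch of the second item, here’s one way AI-drafted summaries can keep the clinician in charge. `complete` stands in for any LLM completion function, and the prompt and labels are hypothetical, not Intellect Atlas’ actual implementation.

```python
from typing import Callable

# Hypothetical prompt; not Intellect Atlas' actual prompt or pipeline.
SUMMARY_PROMPT = (
    "Summarize this session transcript for the treating clinician. "
    "Cover presenting concerns, risk indicators, and agreed next steps. "
    "Mark anything uncertain as 'to confirm with client'.\n\nTranscript:\n{t}"
)

def draft_case_summary(transcript: str, complete: Callable[[str], str]) -> str:
    """Draft a summary for clinician review; never auto-filed as final."""
    draft = complete(SUMMARY_PROMPT.format(t=transcript))
    # Label the output so the human sign-off step can't be skipped.
    return "DRAFT - pending clinician review\n" + draft
```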

Intellect Atlas’ approach to AI mental health

Did you know that the therapeutic alliance, which is the bond between clinician and client, is the strongest predictor of treatment outcomes in mental healthcare? 

At Intellect, we aim to expand access to our platform while preserving the human touch that drives meaningful treatment outcomes. So, in bringing Intellect Atlas to life, we found ourselves asking:

Where does human judgment lead to better outcomes?
Where can automation reduce friction—without reducing empathy?
Where does human presence (or its absence) shape how safe someone feels?

Crucially, where might this experience go wrong?

These aren’t theoretical questions. They’re product questions. Design questions. Roadmap questions. And as the product and design lead at Intellect, I have to make sure we get it right. Because if we don’t, people can get hurt.

The future of AI mental health is hybrid

The next decade of care won’t be AI-first or human-first. It will be value-first.

It’s not AI versus humans; it’s about knowing when to lean on each. Sometimes AI leads, sometimes a human does.

Unlike most narratives surrounding AI, this isn’t about “protecting” jobs. It’s about preserving human touch, values, and judgment across every layer of care—while ensuring both scalability and quality. (We call it hospital-grade delivery and service.)

That’s what we’re really building for.
