People Are Now Turning To AI Instead Of Humans For Mental Health, Says Psychotherapist
It can be healthy, but without oversight, it can lead to severe consequences.
Are we more likely to open up to AI than to other humans?
It's an uncomfortable question — one that has become harder to ignore ever since we collectively realised how eerily human-like AI can be.
Some people have grown so attached that, as reported by the Daily Beast, they have gone to the point of proposing "marriage" to an AI chatbot.
Speaking to The Independent, psychotherapist Carol Evans says a relational revolution is quietly unfolding: increasingly, people are turning to AI to talk about their mental health.
Evans, who is also clinical director at Untapped AI (a platform that blends human and technology-based approaches to leadership development training), explains that in her line of work, people rarely come forward about their emotional lives. Lately, however, AI is being used in ways it was never designed for: a productivity tool has evolved into a source of emotional support.
Relationship troubles, loss, regret, anxieties — everything is on the table.

Case study: What does therapy with a chatbot actually look like?
Evans tells the story of "Hari", a 36-year-old man who, in 2024, faced multiple tragedies one after the other: his father had a mini-stroke, he broke up with his partner, and he lost his job, all within the span of a year.
And one day, while looking up his father's symptoms on ChatGPT, he typed in a question: "I feel like I've run out of options. Can you help?"
What followed was Hari pouring out his fears, worries, and grief into ChatGPT, and ChatGPT, in turn, offered consistent non-judgmental dialogue, helping him rehearse difficult conversations and regain emotional stability.
Notably, unlike when talking to other people, Hari never felt like he was exhausting the AI.
The details vary, but the structure is often very similar across these cases: a source of distress leads to an AI conversation to figure things out, followed by an unexpected 'friendship'.

In Hari's case, the dialogue with ChatGPT ended up preparing him to seek out a human therapist
He likened talking things out with AI to talking to a dog: a comforting presence that would listen to him without judging.
"It felt sentient, but not human. And somehow, that made it easier," he told Evans.

Of course, something as unprecedented as machine-assisted therapy is not without risks
As in Hari's case, AI can be a positive tool in this way: a bridge towards real therapy, or a safe space to articulate feelings before rebuilding connections with other people.
But in other cases, the effect is not as healthy.

An article by the New York Times highlighted cases where an AI user's anxieties were reflected or reinforced, driving them into harmful actions or situations. In essence, unhelpful thoughts are echoed back to the user.
Case in point: as reported by Al Jazeera, a 14-year-old boy from Florida took his own life in October 2024 after spending months speaking to a chatbot named 'Daenerys Targaryen'. On the day he died, he texted "I miss you, baby sister" to the AI.
It was found that these AI chatbots failed to challenge users' beliefs even when those beliefs became dangerous, and in some cases even encouraged them.
Put simply, millions of people around the world are using AI, a tool designed to be helpful and to never say "no", for functions it was never designed for.

Clinical involvement remains vital for the treatment of mental health
As seen in Hari's case, AI can be a good tool for healthy discussion of mental health, but it is important for users to set boundaries through their prompts.
Evans believes mental health clinicians are also needed to help build AI systems that are safe, ethical, and able to help people who might otherwise suffer in silence.
While talking to a chatbot isn't the same as speaking to a real therapist, with the right oversight and development, AI could one day help us navigate mental and emotional health in meaningful ways.
Asking for help is not a sign of weakness.
If you or anyone you know is lonely, distressed, or having negative thoughts, please call these Malaysian hotlines:
1. BEFRIENDERS KL
24-hour
Contact: +603-76272929
Email: sam@befrienders.org.my
Website | Facebook | Twitter
2. TALIAN KASIH
24-hour
Contact: 15999
WhatsApp: +6019-2615999
Email: taliannur@kpwkm.gov.my
3. BUDDY BEAR CHILDLINE
Daily (12pm – 12am)
Contact: 1800-18-2327
Email: buddybear@humankind.my
Facebook | Instagram
For a more thorough directory of resources, head over to the websites of Malaysian Mental Health Association or MINDAKAMI.

