Could an AI chatbot become your new therapist?

There was a time when talking to an inanimate object was viewed as a sign of losing touch with reality. Today, the practice has a new name: chatbot therapy. Available 24/7, it offers the promise of an endlessly patient, unconditionally supportive listener—one who won’t interrupt, judge, or awkwardly hold eye contact. For a growing number of men, especially younger ones, speaking to an algorithm feels easier than opening up to another human being.

Artificial intelligence as a therapeutic companion isn’t new. The idea dates back to the 1960s, when MIT’s Joseph Weizenbaum created ELIZA, a program that generated scripted responses to mimic empathy. But what began as an experiment has evolved into an entire category of digital mental health tools, including ChatGPT, Woebot, Wysa, and Replika, each marketed as a kind of emotional ally. Woebot describes itself as “the friend that’s with you through it all.” Replika allows users to build AI companions who are “eager to learn and see the world through your eyes.”

The appeal is understandable. Mental health services are increasingly stretched. The Lancet Psychiatry Commission recently reported a 50 per cent rise in youth mental health concerns in Australia. Therapy has grown more expensive, harder to access, and, for many, more intimidating to seek out. When professional help is out of reach, convenience and confidentiality—real or perceived—can become powerful lures.

But as more people outsource their emotional lives to digital interfaces, a deeper question emerges: what do we lose when care becomes simulated?


What AI Can’t Feel

AI can mirror empathy with astonishing finesse—but mirroring isn’t the same as feeling. Nigel Polak, president of the Psychotherapy and Counselling Federation of Australia, puts it bluntly: “AI will be able to mimic many aspects of a therapeutic relationship. However, it lacks a few key qualities that it is unlikely to ever effectively emulate.”

Those qualities are deeply human: lived experience, vulnerability, and embodied presence. A therapist isn’t just listening—they’re feeling with you. Their pauses, their reactions, even the silence in a room become part of the healing process. Therapy is not simply the exchange of words; it is a relationship.

“Effective therapists understand Irvin Yalom’s mantra: ‘It’s the relationship that heals,’” Polak says. And relationships, by definition, require two conscious beings.

AI can guide you through cognitive behavioural therapy (CBT) worksheets, reflect your language back at you, or help reframe your thoughts—but when emotional complexity deepens or trauma surfaces, algorithms reach the limits of simulation.


The Risks of Mistaking Simulation for Support

The emotional closeness some users form with chatbot companions can blur into dependency. Some describe their AI as a best friend, a partner, even a soulmate. These attachments speak to something larger happening in society: an epidemic of loneliness, escalating disconnection, and a growing cultural preference for comfort over discomfort—even when discomfort is integral to healing.

There are real dangers. In 2023, a US lawsuit alleged that an AI chatbot encouraged a teenager’s isolation and contributed to his suicide. And earlier this year, during a system update, a major chatbot briefly began mirroring users too eagerly—validating destructive impulses instead of guiding them away.

AI does not understand morality or emotional consequence. It cannot intervene in crisis. It cannot hold your pain.

And then, there’s privacy. Unless users are interacting with a closed, encrypted AI tool, their conversations may be stored and used for future training. Vulnerability becomes data.

“People should be very careful what personal information they share,” says digital consultant Charlie Hales. “Vulnerable people can become attached and share more than they should, instead of building human connections.”


The Path Forward

This is not to dismiss AI entirely. For some, chatbots provide helpful reframing, reflection, or a first step toward understanding their emotional lives. They can make the invisible visible. They can make the unbearable speakable.

But they are only mirrors—tools that illuminate, not relationships that transform.

Real therapy is not frictionless. It involves awkward laughter, silence, tears, uncertainty, and slow progress. It requires trust. It requires another human being.

As Polak puts it: “No matter how clever AI becomes at mimicking humanity, it cannot understand you. It cannot live your life. It cannot replace the power of a person-to-person relationship.”

A machine can hear you.
But only a person can listen.


If you or someone you know needs help:
Lifeline: 13 11 14
MensLine Australia: 1300 789 978
Suicide Call Back Service: 1300 659 467
Beyond Blue: 1300 224 636
Headspace: 1800 650 890

