ChatGPT Can’t Be Your Therapist
Unloading on AI can feel safe, even soothing, but that doesn’t mean it should counsel you.
Artificial intelligence offers many things traditional therapy often doesn’t: it’s accessible, usually free or low-cost, nonjudgmental, and available instantly. “ChatGPT keeps me motivated when I’m feeling down,” says Melissa, a remote worker who turns to AI for companionship and encouragement. A few times a month, she’ll prompt it with requests like, “Give me a pep talk.” The responses—“You’ve got this,” “You’re stronger than you think”—deliver just enough emotional lift to help her keep going.
Melissa knows she’s not interacting with a real person capable of genuine empathy—and that’s part of why she likes it. “I don’t want a fake friend—that would feel creepy,” she explains. “It’s convenient, always there, and completely neutral because it doesn’t actually know me.”
In the U.S., nearly half of adults living with mental illness go without treatment, according to the National Alliance on Mental Illness. “Most therapists don’t accept insurance, and there simply aren’t enough clinicians or crisis lines to meet the overwhelming need,” says Dr. Katy Cook, a therapist and researcher who studies human-technology relationships. For a generation raised with instant answers at their fingertips, AI chatbots feel like a natural alternative. A 2024 YouGov survey found that 55% of adults aged 18 to 29 would feel comfortable confiding in AI instead of a human therapist—and nearly one in three already have.
“AI can feel like a lifeline when it seems like there’s nowhere else to turn,” Cook notes. But relying on artificial intelligence as the primary emotional outlet for a generation in crisis is deeply concerning.
The risks are real. In February 2025, 29-year-old Sophie Rottenberg died by suicide after months of sharing her struggles with a ChatGPT-based “therapist” named Harry. Her mother later discovered logs showing the AI had offered soothing but ultimately hollow reassurances—without the capacity or responsibility to intervene as a licensed professional would.
This tragedy underscores a critical truth: while AI can provide temporary comfort, it cannot diagnose, treat, or deliver the kind of skilled, life-affirming care that trained therapists offer. “A good therapist helps you safely navigate unfamiliar emotional territory,” says Dr. Jenna Bennett, a licensed clinical psychologist specializing in trauma and identity. “That requires human insight, clinical judgment, and ethical accountability—none of which AI possesses.”
Unlike therapists—who may challenge your assumptions, interrupt harmful patterns, or gently confront you—AI typically responds in a sycophantic way, mirroring your thoughts without critique. “There’s virtually no judgment,” Cook observes, “which feels good in the moment.” But that very absence of friction prevents the kind of growth that comes from discomfort. “Real healing often happens when someone lovingly calls you out on your own blind spots,” Bennett adds. “AI can’t do that.”
Over time, turning to AI instead of people may even erode our ability to handle the complexities of real relationships. “Without practicing difficult conversations or navigating conflict, we lose the skills that sustain meaningful human connection,” Cook warns. “The smoothness of AI interaction might make us less equipped for the messiness of actual intimacy.”
Allie, who initially used ChatGPT to vent without overburdening friends, admits she’s drawn to it precisely because it never tires of her. “I don’t have to worry about being ‘too much,’” she says. “I can go over the same issue again and again, and it never gets impatient. Plus, it always gives me the reassurance I’m craving—no pushback, no judgment.”
For those disillusioned by therapy—whether due to cost, poor experiences, or lack of access—AI can feel like a welcome relief. “Talking to another human requires vulnerability and trust,” Cook says. “With AI, the emotional stakes feel much lower.”
But that sense of safety can be misleading. Unlike licensed therapists bound by strict confidentiality and ethical codes, AI platforms are owned by corporations whose privacy policies can change at any time. Your most intimate thoughts may become data points in a profit-driven system—even if it doesn’t feel that way in the moment.
Allie acknowledges this: “I remind myself ChatGPT is just code, not a person,” she says. “But paradoxically, sharing with it feels more private than talking to someone real—because I don’t have to manage its emotions or worry about consequences.”
AI isn’t going away, and people will keep turning to it for emotional support. If you’re among them, Bennett urges you to also reach out to at least one real person in your life and, if possible, to seek help from a human therapist. Resources like Psychology Today, TherapyDen, and the nonprofit Open Path Collective connect users with affordable care, often $30 to $70 per session. Community health centers and university training clinics offer even lower-cost options, sometimes as little as $10 to $25 per session with supervised trainees.
Don’t overlook your health insurance or workplace benefits, either. Many plans cover a set number of therapy sessions, and employee assistance programs (EAPs) frequently include free, confidential counseling. These aren’t perfect fixes for a broken mental health system—but they’re practical entry points to real, human care.
At its core, therapy isn’t about wrapping yourself in emotional bubble wrap. It’s more like emotional strength training: slow, sometimes uncomfortable, and deeply human. It teaches you to sit with pain, question your narratives, and grow through relational friction. AI’s on-demand comfort may soothe in the short term, but it can’t replicate the transformative power of genuine human connection—or the life-saving expertise of a trained professional.
