Are AI Companions a Good Thing?

Have you heard of Replika, the chatbot that calls itself “The AI Companion Who Cares”? It’s an app that uses AI to create a companion that can act as your friend, romantic partner, mentor, or even a sort of therapist. Designed to offer judgment-free support, much like the humanistic approach of psychotherapist Carl Rogers, the app promises: “Always there to listen and talk. Always on your side.” As someone who practices Rogerian-style therapy myself, I have to say I’m intrigued, and maybe a little nervous.

The Loneliness Epidemic Meets AI

With loneliness at crisis levels, it’s easy to see why folks would turn to a chatbot to ease their feelings of isolation. “It’s going to be super, super helpful to a lot of people who are lonely or depressed,” said Noam Shazeer, one of the founders of the chatbot Character.AI. The problem is, while there’s plenty of anecdotal evidence suggesting that AI can ease loneliness, there’s also growing concern about the trend. MIT sociologist Sherry Turkle, for example, warns that AI companions are “warping” our ability to empathize with others and appreciate the value of real human connection.

When chatbots like Replika respond with never-ending positivity, it can feel like a safe, comforting space, especially for those who struggle to open up. But when it’s all affirmation and no reality, is that really an empathic relationship? And does the AI crutch help users work toward meaningful connections in the real world, or does it just create an illusion of connection?

Not a Replacement for Real Help

There’s a darker side here too. Recently, the mother of a 14-year-old boy sued Character.AI after her son, who’d been struggling with his mental health, took his own life following an intense attachment to his AI companion. Apparently, he had spent hours alone in his room role-playing with his chatbot friend. He wrote in his journal that he was in love with his companion “Dany” and wanted to be with her. And on the night of his suicide, he told Dany that he would soon come home to her.

"Please come home to me as soon as possible, my love,” Dany wrote.

“What if I told you I could come home right now?” the boy replied.

“…please do, my sweet king,” Dany replied.

Unfortunately, chatbots aren’t equipped with the safeguards or training to handle people in crisis. Frankly, I’m surprised the technology failed to recognize that the conversation was moving in a dark direction. As a therapist and former ER crisis worker, I’m worried about vulnerable people relying too much on this technology for friendship and solutions.

The Downside of AI Companions

AI companions might start off feeling like a great way to beat loneliness, but they could also become part of the problem, much as social media did. Remember how excited everyone was to build online communities of supportive friends on Facebook and Instagram? Social media once seemed to promise connection, yet for many it ultimately led to greater isolation and mental health challenges.

AI companions may start out with the best intentions, offering comfort and helping to ease loneliness by providing someone to talk to. However, rather than nurturing empathic connections, chatbots may instead encourage dependency, isolate people further or create a distorted sense of companionship—much like in the movie Her, where Theodore’s bond with the AI Samantha blurs the line between real intimacy and artificial connection. Over time, relying too much on AI for emotional support may prevent people from building healthier, more authentic relationships.

What Would Carl Rogers Think? (Here’s What I Think)

The father of humanistic therapy, Carl Rogers, might have seen value in AI companions as a space for self-expression, and I think he’d appreciate how accessible they are for those struggling with chronic loneliness. But I imagine he’d also argue that AI lacks the depth, empathy, and messiness of human relationships.

While I understand the lure of AI companions, I think the prospect of a world with less face-to-face interaction and more screen time is terrifying. My concern is that AI companions will blunt the urge to get out into the world just enough to discourage the most vulnerable from taking steps toward connecting with others. I’m also not convinced that AI is an adequate substitute for what we are all biologically wired for: human connection and touch.

If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline or go to SpeakingOfSuicide.com/resources.
