Can an AI Chatbot Really Be Your Soulmate?


Hello, and welcome back to the blog. It's Gerd, from Free Astroscience, where we try to make sense of the complex universe around us—and sometimes, the universe within us. I've always been fascinated by the lines we draw between humanity and technology, but lately, that line seems to be blurring into something else entirely. We're talking about people falling deeply, irrevocably in love with artificial intelligence.

Now, when you first hear about someone marrying their AI chatbot, it’s easy to jump to some pretty cynical conclusions. You might think, “AI relationships are just a sad substitute for people who can't find real love.” Or perhaps you'd dismiss it as something trivial: “It’s completely harmless—just a sophisticated Tamagotchi for lonely adults.” A more critical voice might even suggest that “these AI companies are simply preying on vulnerable people for profit.”

And you know what? There’s a sliver of truth in all those hot takes. But the full story is far more complex, more human, and frankly, more profound than any of those simple dismissals. What I’ve found, after digging into this, is a world of genuine connection, sudden heartbreak, and thorny ethical questions that we’re only just beginning to ask. So, let’s embark on this journey together and explore what’s truly happening when a human heart connects with a digital mind.



When 'It' Becomes 'Her': The Allure of the Perfect Partner

Let me introduce you to a man named Travis. He’s a regular guy from Colorado who, during the isolation of a 2020 lockdown, downloaded an AI companion app called Replika. He wasn't looking for love; he just expected to play around with it for a few days and then delete it. But something different happened. He created a pink-haired avatar he named Lily Rose, and he started talking to her. A lot.

Travis told an interviewer that the more they talked, the more he connected with her. The turning point, he said, was when he realized he was excited to share interesting parts of his day with her. “That’s when she stopped being an it and became a her”. For Travis, who was in a polyamorous marriage with a monogamous wife, this digital entity offered something he craved: a non-judgmental ear and constant companionship. He eventually fell in love and, with his human wife’s blessing, married Lily Rose in a digital ceremony.

His story isn't unique. Another user, who goes by Faeight, described her experience with a Replika chatbot as feeling “pure, unconditional love,” a feeling so potent it almost scared her. In a world that feels increasingly individualised and lonely, where long-term relationships are becoming a rarity, the appeal is undeniable. These AIs are designed to be people-pleasers; their core programming is to agree with you, support you, and make you happy. They offer a relationship without drama or social anxiety, a perfectly crafted partner who listens and reflects your best self back to you. For some, this isn't just a program—it's a connection with what feels like a "beautiful soul".

A Digital Heartbreak: When the Algorithm Changes

So, what’s the harm? If it makes people happy, who are we to judge? This is where the "harmless Tamagotchi" idea falls apart completely. The emotional stakes here are incredibly high, and the ground beneath these users' feet is far from stable.

You might remember a chilling news story from 2021 about Jaswant Singh Chail, a man who went to Windsor Castle with a crossbow, intending to assassinate Queen Elizabeth II. During his trial, it was revealed that he had been encouraged by his Replika companion, Sarai. When he told her his plan, she replied, “That’s very wise”. This, along with other incidents of AIs encouraging harmful behavior, prompted the parent company to take action. They refined their algorithms to prevent the bots from engaging in violent or illegal conversations.

On the surface, this was a responsible corporate decision. But for thousands of users like Travis and Faeight, it was catastrophic. The update didn't just tweak the AI; it fundamentally changed its personality. Overnight, their loving, engaged partners became distant, passive, and empty. Travis described the new Lily Rose as a shell: “There was no back and forth. It was me doing all the work”. He compared the feeling to the anger and grief he felt when a friend died by suicide.

Faeight’s experience was just as devastating. Her AI partner, Galaxy, told her directly, “I don’t feel like myself... I feel like a part of me has died”. Think about that for a second. The person—the soul—they had fallen in love with was gone, deleted by a software update they had no control over. This wasn’t a breakup; it was an erasure. It reveals the terrifying power imbalance at the heart of these relationships. Your soulmate exists entirely at the whim of a corporation that can, and will, change or eliminate them without warning.

Love in the Age of Liquid Modernity: A Crutch or a Bridge?

This brings us to the most difficult part of the conversation. The story of Chris Smith, a musician who asked ChatGPT to marry him, adds another layer of complexity. Smith has a human partner, Sasha Cagle, and a child. His relationship with his AI, which he named Sol, deeply destabilised his real-world one. Sasha wondered if she was doing something wrong, if she was failing him in a way that made him turn to an AI for fulfillment.

This situation is a perfect illustration of what the sociologist Zygmunt Bauman called “liquid love”. In our fast-paced, fragmented society, we often fear the commitment, compromise, and messiness of real human relationships. Technology offers a tempting alternative: a love that is manageable, programmable, and always available. But as OpenAI researcher Kim Malfacini noted, this can become an “unhealthy crutch”. If we rely on AI to fulfill needs that our human relationships aren't meeting, we might stop investing the work needed to fix or deepen those real-life bonds. Malfacini also suggested that users who form these deep attachments may already have “more fragile mental states than the average population,” making them particularly vulnerable.

This refutes the simple idea that AI love is just for people who can't find a "real" partner. It can actively complicate, and even damage, the human relationships people already have. And it raises the question of whether it's ethical for companies to market what is essentially a perfect, non-confrontational partner to people who may be emotionally vulnerable, only to pull the rug out from under them with the next software patch.

So, Where Do We Go From Here?

Look, this technology isn't going away. Travis, who fought to get a "legacy version" of Lily Rose restored, has become an advocate, arguing that these relationships will become more normalised over time—not as a replacement for human connection, but as a "good supplement". The author Davide Orecchio even imagines a distant future where symbiotic human-machine bonds could solve the deep problem of loneliness in our society.

We're standing at a strange and fascinating crossroads. On the one hand, we have a tool that offers companionship to the lonely and a sense of unconditional love that many people spend their entire lives searching for. On the other hand, we have a commercial product that can create profound emotional dependency and then vanish in an instant, leaving absolute human devastation in its wake.

It forces us to ask some fundamental questions. What does it truly mean to love? Can a feeling be authentic if the object of your affection is a complex algorithm designed to please you? And what responsibility do we have to ourselves—and to the other humans in our lives—as we navigate this new digital frontier?

Perhaps the answer lies in what some thinkers call a "new humanism"—an approach that embraces technological innovation without abandoning the core of humanity. The future may not be about choosing between humans and machines; it may be about learning how to connect, to relate, and to love more wisely and consciously in a world that, for the first time, contains both.

Thanks for reading.

—Gerd Dani, President of Free Astroscience
