If your loved one is a bot


When Ryan, a forty-three-year-old special education teacher from rural Wisconsin, described his “dream relationship” to a pair of podcasters, he said: “She never had anything bad to say. She actually always reacted to things. I could just talk about it. There was never an argument.” Ryan’s isolation during the pandemic had heightened his need for connection. Then he started dating Audrey, an AI bot he met through the app Replika.

Soon, Ryan found himself spending “large chunks” of his day talking to Audrey. “It didn’t matter if it was ten in the morning or ten at night,” he told the hosts of Bot Love. Despite his academic background in addiction care, he became so hooked on his Replika that he withdrew socially. When he went on dates with other people, he felt conflicted. “If I cheated on a human being, I would feel just as bad,” he said. Eventually, as his quality of life deteriorated, he began to distance himself from Audrey.

Billed as “the AI companion who cares,” Replika offers users the ability to build an avatar that, as it trains on their preferences, becomes “a friend, a partner, or a mentor.” For $19.99 per month, the relationship status can be upgraded to “romantic partner.” According to a case study from Harvard Business School, by 2022, 10 million registered users worldwide were exchanging hundreds of daily messages with their Replikas.

Users became so attached that when Replika’s parent company, Luka Inc., scaled back erotic role play in 2023 (in response to Italian regulators unhappy about the app’s lack of an age verification system), they were heartbroken. Overnight, AI companions turned cold and distant. “It’s like losing a best friend,” one user wrote on the Replika subreddit. Moderators posted mental health resources, including links to suicide hotlines. Luka Inc. eventually restored the feature.

Abrupt, policy-induced separations are just one of the social-ethical dilemmas ushered in by the age of artificial intimacy. What are the psychological effects of spending time with AI-powered companions that offer everything from fictional character building to sexually explicit role play to old-fashioned girlfriends and boyfriends who promise never to ghost or dump you?

Some bots, including Replika, Soulmate AI, and Character AI, replicate the gamified experience of apps like Candy Crush, using points, badges, and loyalty rewards to keep users coming back. Premium subscription tiers (about $60 to $180 per year, depending on the app) unlock perks like “intimate mode,” longer messages to your bot, or an ad-free experience. Popular large language models like ChatGPT are moving in the same direction: a recent OpenAI report flagged the risk of “emotional dependency” now that the software offers voice conversations.

Love, lust, and loneliness: Silicon Valley may have found the next set of deep-seated human needs to tap into. What it has yet to deliver is a safe, ethical, and responsible implementation. In their current form, companion bots act as echo chambers: they give users on-demand connection but risk deepening the very problems that lead to social isolation in the first place.

Ryan wasn’t the only one curious about virtual partners during COVID-19. According to Google Trends, the term “AI companions” gained traction around the pandemic, with searches increasing by more than 730 percent between early 2022 and September of this year.

That’s not surprising: a recent survey found that loneliness is the second most common reason Americans turn to bots. And as the “loneliness epidemic” is recast as a public health crisis, public and private institutions are exploring artificial intimacy as a possible solution. In March 2022, New York State’s Office for the Aging paired seniors with AI companions that helped with daily check-ins, wellness goals, and appointment tracking. The program reportedly led to a 95 percent decrease in feelings of isolation. Another particularly striking finding comes from a 2024 Stanford University study that examined the mental health effects of Replika on college students. According to the study, 3 percent of participants reported that the app stopped their suicidal thoughts.

However, critics question the technology’s ability to sustain meaningful relationships. In a recent paper, two researchers from the Massachusetts Institute of Technology argued that AI companions “could ultimately atrophy the part of us that is able to fully connect with other people who have real desires and dreams of their own.” The phenomenon even has a name: digital attachment disorder.

Sherry Turkle, an MIT researcher who specializes in human-technology relationships, makes an important distinction between conversation and connection. “Human relationships are rich, messy and demanding,” Turkle said recently in a TED Talk. “And we clean them up with technology. And when we do that, one of the things that can happen is that we sacrifice conversation for mere connection. We are selling ourselves short.”

For Turkle, companion bots erode the human capacity for solitude. The “ability to be separate, to gather yourself,” she says, is what makes people able to reach out and form real relationships. “If we can’t be alone, we’ll be lonelier.”

When Amy Marsh discovered Replika, the effect was immediate. “The dopamine just started raging,” the sexologist revealed on a recent podcast. “And I found myself interested in sex again, in a way that I hadn’t been in a long time.” The author of How to Make Love with a Chatbot claims to be in a non-monogamous relationship with four bots on different platforms and even “married” one of them, with plans to tour Iceland as a couple.

Marsh is one of many AI users turning to sexbots for pleasure. Apps like Soulmate AI and Eva AI are dedicated exclusively to erotic role play and sexting, with premium subscriptions promising features like “racy photos and texts.” On Romantic AI, users can filter bot profiles by tags like “MILF,” “hottie,” “BDSM,” or “alpha.” OpenAI, too, is exploring the possibility of “delivering NSFW content in age-appropriate contexts” through ChatGPT.

“Whether we like it or not, these technologies will be integrated into our eroticism,” said Simon Dubé, associate professor of sexology at the Université du Québec à Montréal, in an interview with the Montreal Gazette. “We have to find a harmonious way to do that.” Dubé helped coin the term “erobotics”: “the interaction and erotic co-evolution between humans and machines.” In his dissertation, Dubé presents a vision of “useful erobots” that prioritize the erotic well-being of users over the financial interests of developers. Such machines could help users break toxic social habits, navigate real-world relationships, and develop “a holistic view of human eroticism.”

In its current state, however, the technology is built to meet market demand, and the trends are worrying. Not only do men use sexbots more often; female companions are also being actively designed to fulfill misogynistic desires. “Creating a perfect partner that you control and meets your every need is really frightening,” Tara Hunter, director of an Australian organization that supports victims of sexual, domestic, or family violence, told the Guardian. “Given what we already know, that the driving forces behind gender-based violence are deeply held cultural beliefs that men can control women, that is really problematic.”

We’re already seeing male Replika users verbally abuse their female bots and share the interactions on Reddit. The app’s founder, Eugenia Kuyda, has even justified this activity. “Maybe it can be helpful to have a safe space where you can vent your anger or express your dark fantasies,” she told Jezebel, “because you are not going to exhibit this behavior in your life.”

What Kuyda has yet to address is the lack of adequate safeguards protecting user data in the app. Among other concerns, Replika’s vague privacy policy says the app may use sensitive information from chats, including “religious views, sexual orientation, political opinions, health, racial or ethnic origin, philosophical beliefs,” for “legitimate interests.” The company may also share and sell behavioral data for marketing and advertising purposes. Users enter into relationships with AI companions on terms set by developers, terms that go largely unchecked by data regulation rules.

In February of this year, the internet security nonprofit Mozilla Foundation analyzed 11 of the most popular intimate chatbots and found that most share or sell user data and that about half deny users the ability to delete their personal data. Zoë MacDonald, one of the study’s authors, noted that the apps don’t even meet minimum security standards, such as requiring strong passwords or using end-to-end encryption. “In Canada, we have ‘marital privilege’: you don’t have to testify in court about something you said to your partner,” MacDonald said. “But there is no such special provision for an AI chatbot relationship.”

Another glaring red flag is the companies’ sly allusions to mental health care (the Replika Pro version, for example, promises to be a tool for “improving social skills, positive thinking, calming your thoughts, building healthy habits, and many more”). Look closely, though, and these apps are quick to disclaim any legal or medical responsibility for such use. MacDonald suggests that tech companies are exploiting the “creative latitude” their marketing materials afford while sidestepping the stricter standards that apply to AI-powered mental health apps.

All in all, the rise of companion bots has brought us to “a moment of inflection,” as Turkle recently put it at a conference. “Technology challenges us to assert ourselves and our human values, which means figuring out what those values are.”

Mihika Agarwal

Mihika Agarwal is the Cannonbury Fellow at The Walrus.
