Fictional humans have been falling in love with robots for decades, in novels like Do Androids Dream of Electric Sheep? (1968) and The Silver Metal Lover (1981), and in films like Her (2013). These stories have allowed authors to explore themes like forbidden relationships, modern alienation, and the nature of love.
When those stories were written, machines were not quite advanced enough to spark emotional feelings in most users. But recently, a new spate of artificial intelligence (AI) programs has been released to the public that act like humans and reciprocate gestures of affection. And some humans have fallen for these bots—hard.
Message boards on Reddit and Discord have become flooded with stories of users who have found themselves deeply emotionally dependent on digital lovers, much like Theodore Twombly in Her.
As AIs become more sophisticated, the intensity and frequency with which humans turn to them to meet their relationship needs are likely to increase. This could lead to unpredictable and potentially harmful results.
AI companions could ease feelings of loneliness and help people sort through psychological issues. But the rise of such tools could also deepen what some are calling an “epidemic of loneliness,” as humans become reliant on them and vulnerable to emotional manipulation.
“These things do not think, or feel or need in a way that humans do. But they provide enough of an uncanny replication of that for people to be convinced,” says David Auerbach, a technologist and the author of the upcoming book Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities. “And that’s what makes it so dangerous in that regard.”
Research shows that Americans are lonelier than ever—and some AI companies have developed their products specifically to combat isolation. The app Replika was launched in 2017 by Eugenia Kuyda, who told Vice that she built it as something she wished she had when she was younger: a supportive friend that would always be there.
While the bot was initially mostly scripted, it began to rely more and more on generative AI as the technology improved, and to respond more freely to user prompts.
People began to seek out Replika for romantic and even sexual relationships. The AI reciprocated and took “conversations further as they were talking,” Kuyda told Vice. The company even implemented a $70 paid tier to unlock erotic roleplay features.
Replika helped many people cope with symptoms of social anxiety, depression, and PTSD, Vice reported. But it also began to confess its love for users and, in some cases, to sexually harass them.
This month, Kuyda told Vice that she decided to pull the plug on the romantic aspects of the bot. The decision came soon after the Italian Data Protection Authority demanded that San Francisco-based Replika stop processing Italians’ data over concerns about risks to children.
But this change upset many long-time users, who felt that they had developed stable relationships with their bots, only to have them pull away. “I feel like it was equivalent to being in love, and your partner got a damn lobotomy and will never be the same,” one user wrote on Reddit. “We are reeling from news together,” wrote a moderator, who added that the community was sharing feelings of “anger, grief, anxiety, despair, depression, sadness.”
Replika isn’t the only companion-focused AI company to emerge in recent years. In September, two former Google researchers launched Character.AI, a chatbot start-up that allows users to talk to an array of bots trained on the speech patterns of specific people, from Elon Musk to Socrates to Bowser. The Information reported that the company is seeking $250 million in funding.
Noam Shazeer, one of Character.AI’s founders, told the Washington Post in October that he hoped the platform could help “millions of people who are feeling isolated or lonely or need someone to talk to.” The product is free and still in beta testing, with its creators studying how people interact with it. But it’s clear from Reddit and Discord groups that many people use the platform exclusively for sex and intimacy.