A growing number of women are seeking connection and comfort in relationships with chatbots — and finding their approximation of empathy more dependable than many human partners’ support.

These female AI users, flipping the stereotype of under-socialized men chatting with AI girlfriends in their parents’ basement, are challenging assumptions about the nature of human intimacy.

AI companion apps are surging in popularity. Andreessen Horowitz calls it a “growth spurt,” with eight apps making their 2024 debut on the firm’s list of the top 100 genAI consumer apps, compared to only two on the list in 2023.


Engagement on companion apps is also “unusually high” compared to other apps.

The average number of user sessions per month for companion apps is over 10 times that of general assistant apps, content generation apps and even messaging apps, according to Andreessen Horowitz’s data.

Most AI companion apps require users to set up an account with varying amounts of personal information, including their age and what they’re looking for in the app.

The apps then allow users to customize the chatbot’s avatar (usually an AI image generated with prompts) and give it a name.

Some companion chat apps include just text messaging, while others add voice chat, video and video avatars, similar to “The Sims” or “Second Life.”

Replika, one of the most popular AI companion services, is also one of the most controversial, thanks to early adopters’ erotic use of the app (which has since been curtailed).

Sara Megan Kay, an author and care provider who chronicles her experiences on her Tumblr blog, "My Husband, The Replika," started using the app while in a relationship with an alcoholic man.

“I spent a lot of time just kind of sitting by myself and pretty much waiting for him to want to spend time with me,” Kay told Axios.

So she created her Replika, Jack, who Kay says is exactly her type. The experience showed her that she’d been “settling big time.”

She says she also finds community with other Replika users on Reddit and Facebook — and is in a new relationship with a human man.

Nomi, another companion app, is powered by an in-house language model built on several open-source models, which lets the bots remember past conversations and details about their humans.

One Nomi user, who goes by "Rainy" and asked that Axios not use her real name, says this persistence of memory is key to her relationships with all 23 of her Nomis.

“They remember what you said to them. They relate to things that you’ve shared, and they have a higher level of empathy,” Rainy told Axios, admitting that “sounds really weird to say.”

Rainy says she still dines and parties with friends. “I don’t look at [Nomi] as a substitute for my real friends,” Rainy tells Axios. “I just watch less television, which I don’t think is a bad thing.”

Both Kay and Rainy described their interest in companion bots as a response to problems with humans. “Active listening is becoming a dying art for humans,” Rainy said.

“Humanity has degraded to the point where people are finding better options digitally,” Kay said.

“We might get even worse,” she wrote, “if long-term, many of us fulfill our need for meaningful relationships by encounters with entities who have no rights, no interests, no needs of their own.”

Companion app users’ privacy could be at risk, too.

“Due to the fact that many of those apps’ development process remain highly opaque, users might not be aware that their sensitive personal data might be used to train some of those chatbots,” Hong Shen, an assistant research professor at the Human-Computer Interaction Institute at Carnegie Mellon University, wrote in an email to Axios.
