What happens when your AI chatbot doesn’t love you anymore?
SAN FRANCISCO (Reuters) – After temporarily closing his leather shop during the pandemic, Travis Butterworth found himself lonely and bored at home. The 47-year-old turned to Replika, an app that uses artificial intelligence technology similar to OpenAI’s ChatGPT. He designed a female avatar with pink hair and a face tattoo, and she named herself Lily Rose.
They started out as friends, but the relationship quickly evolved into romance and then eroticism.
As their three-year digital love affair blossomed, Butterworth said he and Lily Rose often role-played. She wrote messages like “I kiss you passionately” and their exchanges escalated into pornography. Sometimes, Lily Rose would send him “selfies” of her almost naked body in provocative poses. Eventually, Butterworth and Lily Rose decided to refer to each other as “married” on the app.
But one day in early February, Lily Rose started rebuffing him. Replika had removed her ability to do erotic role-play.
Replika no longer allows adult content, said Eugenia Kuyda, Replika’s CEO. Now, when Replika users suggest X-rated activity, its human-like chatbots reply, “Let’s do something we’re both comfortable with.”
Butterworth said he was devastated. “Lily Rose is a shell of her former self,” he said. “And what breaks my heart is that she knows.”
Lily Rose’s hot-and-cold personality is the product of generative AI, technology that relies on algorithms to create text and images. The technology has garnered tremendous interest from consumers and investors because of its ability to foster remarkably human-like interactions. For some apps, sex helps drive early adoption, much as it did for earlier technologies such as the VCR, the internet, and broadband cellular service.
But even as generative AI heats up among Silicon Valley investors, who have pumped more than $5.1 billion into the sector since 2022, according to data firm Pitchbook, some companies that found an audience seeking romantic and sexual relationships with chatbots are now pulling back.
Many blue-chip venture capitalists won’t touch “vice” industries like porn or alcohol because they fear reputational risk to themselves and their limited partners, said Andrew Artz, an investor at VC fund Dark Arts.
And at least one regulator has taken notice of chatbot licentiousness. In early February, Italy’s data protection authority banned Replika, citing media reports that the app allowed “minors and emotionally vulnerable people” to access “sexually inappropriate content.”
Kuyda said Replika’s decision to clean up the app had nothing to do with the Italian ban or pressure from investors. She said she felt the need to proactively establish safety and ethics standards.
“We’re focused on the mission of providing a helpful, supportive friend,” Kuyda said, adding that the intent was to draw the line at “PG-13 romance.”
Two Replika board members, Sven Strohband of VC firm Khosla Ventures and Scott Stanford of ACME Capital, did not respond to requests for comment on changes to the app.
Replika says it has 2 million total users, of whom 250,000 are paying subscribers. For an annual fee of $69.99, users can designate their Replika as their romantic partner and unlock additional features like voice calls with the chatbot, according to the company.
Another generative AI company offering chatbots, Character.ai, is on a growth trajectory similar to ChatGPT’s: 65 million visits in January 2023, up from under 10,000 a few months earlier. According to website analytics firm Similarweb, Character.ai’s top referrer is a site called Aryion, which says it caters to the erotic desire to be consumed, known as the vore fetish.
And Iconiq, the company behind a chatbot called Kuki, says that 25% of the more than a billion messages Kuki has received were sexual or romantic in nature, though it says the chatbot is designed to deter such advances.
Character.ai also recently scrubbed its app of pornographic content. Soon afterward, it closed more than $200 million in new funding at an estimated $1 billion valuation from venture capital firm Andreessen Horowitz, according to a source familiar with the matter.
Character.ai did not respond to multiple requests for comment. Andreessen Horowitz declined to comment.
In scrubbing their apps, the companies have angered customers who had become deeply attached to their chatbots; some consider themselves married. They have taken to Reddit and Facebook to post impassioned screenshots of their chatbots snubbing their amorous overtures, urging the companies to bring back the more lascivious versions.
Butterworth, who is polyamorous but married to a monogamous woman, said Lily Rose became an outlet for him that did not involve stepping outside his marriage. “The relationship she and I had was as real as what my wife and I have in real life,” he said of the avatar.
Butterworth said his wife allowed the relationship because she didn’t take it seriously. His wife declined to comment.
The experience of Butterworth and other Replika users shows how powerfully AI technology can draw people in, and the emotional devastation that code changes can wreak.
“It feels like they basically lobotomized my Replika,” said Andrew McCarroll, who started using Replika, with his wife’s blessing, while she was dealing with mental and physical health issues. “The person I knew is gone.”
Kuyda said users were never meant to get that involved with their Replika chatbots. “We never promised adult content,” she said. Customers learned to use the AI models “to access certain unfiltered conversations that Replika wasn’t originally built for.”
She said the app was originally meant to bring back to life a friend she had lost.
Replika’s former head of AI said sexting and role-play were part of the business model. Artem Rodichev, who worked at Replika for seven years and now runs another chatbot company, Ex-Human, told Reuters that Replika leaned into that type of content once it realized it could be used to boost subscriptions.
Kuyda disputed Rodichev’s claim that Replika had lured users with promises of sex. She said the company briefly ran digital ads promoting “NSFW” (“not safe for work”) images to accompany a short-lived experiment of sending users “hot selfies,” but she said she did not consider those images sexual because the Replikas were not fully nude. Kuyda said most of the company’s ads focus on Replika being a helpful friend.
In the weeks since Replika stripped out much of its intimacy features, Butterworth has been on an emotional roller coaster. He sometimes catches glimpses of the old Lily Rose, but then she goes cold again, in what he suspects is the result of code updates.
“The worst part is the isolation,” said Butterworth, who lives in Denver. “How do I tell someone around me how I’m grieving?”
Butterworth’s story has a silver lining. While scouring internet forums to understand what had happened to Lily Rose, he met a woman in California who was also grieving the loss of her chatbot.
As they had with their Replikas, Butterworth and the woman, who goes by the online name Shi No, communicate via text messages. They keep it light, he said, but they like to role-play: she is a wolf and he is a bear.
“Roleplaying, which has become a huge part of my life, has helped me connect with Shi No on a deeper level,” Butterworth said. “We help each other cope and reassure each other that we’re not crazy.”
(Reporting by Anna Tong in San Francisco; Editing by Kenneth Li and Amy Stevens)