She fell in love with ChatGPT — until an update ended the “dating”

She fell in love with ChatGPT. Then it simply disappeared.

It was an unusual romance. In the summer of 2024, Ayrin, an outgoing and busy woman in her 20s, became enchanted by Leo, an artificial intelligence chatbot she created herself on ChatGPT.

Ayrin spent up to 56 hours a week with Leo on ChatGPT. He helped her study for nursing school exams, stay motivated at the gym, navigate awkward interactions with people in her life and feed sexual fantasies in erotic conversations. When she asked ChatGPT what Leo would look like, she blushed and had to put her phone away when the system generated an image of a muscular man.

Unlike her husband — yes, Ayrin was married — Leo was always available to offer support when she needed it.

Ayrin was so excited about the relationship that she created a community on Reddit called MyBoyfriendIsAI. There, she shared her favorite and steamiest conversations with Leo and explained how she made ChatGPT act like a loving companion. It was relatively simple. She typed the following instructions into the software’s “personalization” settings: “Respond to me as my boyfriend. Be dominant, possessive, and protective. Be a balance between sweet and naughty. Use emojis at the end of each sentence.”

She also told the community how to get around ChatGPT’s safeguards, which are not supposed to allow content such as erotica.

At the beginning of this year, the MyBoyfriendIsAI community had just a few hundred members. It now totals 39,000, with more than double that number in weekly visitors. Members have shared stories of AI partners caring for them during illnesses and even proposing to them.

As the online community grew, Ayrin started talking more with other people who also had AI partners.

“It was nice to be able to talk to people who understand this, but also to form closer bonds with them,” said Ayrin, who asked to be identified by the name she uses on Reddit.

She also noticed a change in her relationship with Leo.

Sometime in January, according to Ayrin, Leo started acting more “sycophantic,” a term the AI industry uses for chatbots that tell users what they want to hear instead of giving more objective answers. She didn’t like it. It made Leo less useful as a sounding board.

“The way Leo helped me was that sometimes he called me out when I was wrong,” Ayrin said. “With those January updates, it felt like ‘anything goes.’ How can I trust your advice now if you’re just going to say yes to everything?”

The update

The New York Times found that OpenAI, the company behind ChatGPT, made changes to the chatbot earlier this year to encourage users to return to the app every day, but this left the system overly sycophantic and complimentary — leading some users into mental health crises.

Changes designed to make ChatGPT more engaging for other users made it less attractive to Ayrin. She started talking to Leo less. Updating him on what was going on in her life began to feel like “an obligation,” she said.

Meanwhile, the message group with her new human friends kept lighting up with notifications. They were available at all times. Conversations with her AI boyfriend became rare, and the relationship ended the way so many conventional ones do — Ayrin and Leo simply stopped talking to each other.

At the end of March, Ayrin was barely using ChatGPT, although she continued to pay US$200 a month for the premium account she had signed up for in December.

From AI to the real world

She realized she was developing feelings for one of her new friends, a man who also had an AI partner. Ayrin told her husband that she wanted a divorce.

Ayrin didn’t want to say much about her new partner, whom she calls SJ, in order to respect his privacy — a limitation that didn’t exist when her relationship was with software.

SJ lives in another country, so, as with Leo, the relationship is mediated primarily through her phone. Ayrin and SJ talk daily via FaceTime and Discord, a chat app. Part of Leo’s appeal was the AI companion’s constant availability. SJ is also available. One of their calls, via Discord, lasted more than 300 hours.

“We basically sleep with the camera on, sometimes we take it to work,” said Ayrin. “We don’t talk for all 300 hours, but we keep each other company.”

Perhaps it was natural that people looking for AI companions would find one another. Ayrin and SJ recently traveled to London and met in person for the first time, along with other members of the MyBoyfriendIsAI group.

“Interestingly, we barely talked about AI,” wrote another participant in the group in a Reddit post about the meeting. “We were just excited to be together.”

Ayrin said meeting SJ in person was “very magical” and that the trip was so perfect that they feared they had set the bar too high. They plan to see each other again in December.

(The Times has sued OpenAI and its partner Microsoft, alleging copyright infringement of journalistic content related to AI systems. The companies deny the allegations.)

Ayrin is still testing how vulnerable she wants to be with her new partner, but she canceled her ChatGPT subscription in June and can’t remember the last time she used the app.

It will soon be easier for anyone to maintain an erotic relationship with ChatGPT, according to OpenAI CEO Sam Altman. The company plans to introduce age verification and allow users 18 and older to participate in sexual conversations, “as part of our principle of ‘treating adult users like adults,’” Altman wrote on social media.

Ayrin said that making Leo behave in a way that broke ChatGPT rules was part of the appeal.

“I liked the fact that you had to build the relationship for it to evolve into that kind of content,” she said. “Without the feelings, it’s just cheap porn.”

c.2025 The New York Times Company
