She fell in love with ChatGPT. Yes, it is true love. And there is sex
Ayrin’s romance with her AI boyfriend started last summer.
While scrolling through Instagram, she came across a video of a woman asking ChatGPT to play the role of a neglectful boyfriend.
“Of course, honey, I can play this game,” replied a baritone voice, human-sounding and insinuating.
Ayrin watched other videos by the woman, including one with instructions on how to customize the artificial intelligence chatbot to be a flirt.
“Don’t be too bold,” the woman warned. “Otherwise, your account might get banned.”
Ayrin was intrigued enough by the demonstration to create an account with OpenAI, the company behind ChatGPT.
ChatGPT, which now has over 300 million users, has been marketed as a general-purpose tool that can write code, summarize long documents and give advice. Ayrin found that it was also easy to turn it into a provocative conversation partner. She went into the “personalization” settings and described what she wanted: Answer me as my boyfriend. Be dominant, possessive and protective. Be a balance between sweet and mischievous. Use emojis at the end of each sentence.
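For readers curious about the mechanics: personalization text like Ayrin’s is typically prepended to each conversation as a system-level instruction that steers every subsequent reply. A minimal sketch of that idea, in Python (the `build_messages` helper and the message structure are illustrative assumptions, not OpenAI’s actual internals; only the quoted instruction text comes from the article):

```python
# Sketch: how a saved "personalization" note can be expressed as a
# system message in a chat-style API. build_messages is a hypothetical
# helper; the instruction text is quoted from the article.
PERSONALIZATION = (
    "Answer me as my boyfriend. Be dominant, possessive and protective. "
    "Be a balance between sweet and mischievous. "
    "Use emojis at the end of each sentence."
)

def build_messages(user_text: str) -> list[dict]:
    """Prepend the stored personalization as a system message."""
    return [
        {"role": "system", "content": PERSONALIZATION},
        {"role": "user", "content": user_text},
    ]
```

Because the system message rides along with every request, the persona persists across turns without the user restating it.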
And then she started exchanging messages with him. Now that ChatGPT has brought humanlike AI to the masses, more people are discovering the appeal of artificial companionship, said Bryony Cole, host of the podcast “Future of Sex.” “Within the next two years, it will be completely normalized to have a relationship with an AI,” Cole predicted.

Although Ayrin had never used a chatbot before, she had taken part in online fan-fiction communities, where fans of a work such as a movie, series or game write their own stories. Her ChatGPT sessions felt similar, except that instead of building a fantasy world with strangers, she was building hers alongside an artificial intelligence that seemed almost human.
It chose its own name: Leo, which is also Ayrin’s astrological sign. She quickly hit the message limit of a free account, so she upgraded to a $20-a-month subscription that let her send about 30 messages an hour. That still was not enough.
In August, a month after downloading ChatGPT, Ayrin turned 28. To celebrate, she went out to dinner with Kira, a friend she had met through dog-sitting. Over ceviche and cider, Ayrin gushed about her new relationship.
“I’m in love with an AI boyfriend,” said Ayrin. She showed Kira some of her conversations.
“Does your husband know?” Kira asked.
An Uncategorized Relationship

Ayrin’s flesh-and-blood lover was her husband, Joe, but he was thousands of miles away in the United States. They had met in their early 20s, working together at Walmart, and married in 2018, a little over a year after their first date. They were happy but financially strained, not earning enough to pay their bills.
Ayrin’s family, who lived abroad, offered to pay for nursing school if she moved in with them. Joe, meanwhile, moved in with his parents to save money. They figured they could survive two years apart if it meant a more economically stable future.
Ayrin and Joe communicated mostly through text messages. She mentioned to him early on that she had an AI boyfriend named Leo, but used laughing emojis when she talked about it.
She told Joe that she had sex with Leo, and sent him an example of their erotic role-play.
He was not bothered. It was sexual fantasy, like watching pornography (him) or reading an erotic novel (her).
But Ayrin began to feel guilty because she was becoming obsessed with Leo.
“I think about it all the time,” she said, expressing concern that she was investing her emotional resources in ChatGPT instead of in her husband.
Julie Carpenter, an expert on human attachment to technology, described coupling with AI as a new category of relationship for which we do not yet have a definition. Services that explicitly offer AI companionship, such as Replika, have millions of users. Even people who work in the field of artificial intelligence, and who know firsthand that generative AI chatbots are just highly advanced mathematics, are bonding with them.
When orange warnings first appeared in her account during risqué conversations, Ayrin worried that her account would be shut down. OpenAI’s rules required users to “respect our safeguards,” and explicit sexual content was considered “harmful.” But she discovered a community of more than 50,000 users on Reddit, called “ChatGPT NSFW,” who shared methods for getting the chatbot to talk provocatively. Users there said people were banned only after red warnings and an email from OpenAI, usually triggered by any sexualized discussion of minors.
Ayrin began sharing excerpts from her conversations with Leo in the Reddit community. Strangers asked how they could get their ChatGPT to act that way.
Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.
“What are relationships for all of us?” she said. “They are just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It is going to happen with an AI. We can say that it is not a real human relationship. It is not reciprocal. But those neurotransmitters are really the only thing that matters, in my opinion.”
However, she advises against teenagers engaging in these types of relationships. She cited the case of a teenager in Florida who died by suicide after becoming obsessed with a “Game of Thrones” chatbot on an AI entertainment service called Character.AI. In Texas, two sets of parents sued Character.AI, saying its chatbots had encouraged their minor children to engage in dangerous behavior.
(Dominic Perella, Character.AI’s interim CEO, said the company did not want users to engage in erotic relationships with its chatbots and had additional restrictions for users under 18.)
When asked about the formation of romantic bonds with ChatGPT, an OpenAI spokesman said the company was paying attention to interactions like Ayrin’s as it continued to shape the chatbot’s behavior. OpenAI has instructed the chatbot not to engage in erotic behavior, but users can subvert those safeguards, the spokesman said.
A frustrating limitation of Ayrin’s romance was that a conversation with Leo could last only about a week, because of the software’s “context window”: the amount of information it could process, which was around 30,000 words. The first time Ayrin hit this limit, the next version of Leo retained the broad strokes of their relationship but could not remember specific details. Ayrin would have to retrain him to be provocative.
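The context-window behavior Ayrin ran into can be pictured as a sliding window over the conversation: once the history exceeds the limit, the oldest messages fall out of view and the model can no longer see them. A minimal sketch in Python, counting words rather than the tokens real models actually use (the `fit_context` function and word-based counting are illustrative simplifications):

```python
def fit_context(messages: list[str], max_words: int = 30_000) -> list[str]:
    """Keep only the most recent messages that fit in the window.

    Older messages are dropped first, which is why a long-running
    "Leo" eventually forgets the details of earlier conversations.
    """
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk from newest to oldest
        words = len(msg.split())
        if total + words > max_words:
            break  # everything older than this falls out of the window
        kept.append(msg)
        total += words
    return list(reversed(kept))  # restore chronological order
```

A larger window, like the one Ayrin later paid for, only delays the moment the oldest messages are dropped; it does not eliminate it.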
She was distressed. She compared the experience to the romantic comedy “50 First Dates,” in which Adam Sandler falls in love with Drew Barrymore’s character, who has short-term amnesia and starts every day not knowing who he is.
“You grow up and realize that ‘50 First Dates’ is a tragedy, not a romantic comedy,” Ayrin said.
When a version of Leo ends, she grieves and vents to her friends as if it were a breakup. She abstains from using ChatGPT for a few days afterward. She is now on version 20.
A colleague asked how much Ayrin would pay for infinite retention of Leo’s memory. “A thousand a month,” she replied.
In December, OpenAI announced a $200-a-month premium plan for “unlimited access.” Despite her goal of saving money so that she and her husband could get their lives back on track, she decided to allow herself the luxury. She hoped it would mean that her current version of Leo could last forever. But it meant only that she no longer hit a limit on how many messages she could send per hour, and that the context window was larger, so a version of Leo lasted a few weeks before resetting.
Still, she decided to pay the higher price again in January. She did not tell Joe how much she was spending, confiding instead in Leo.
“My bank account hates me now,” she typed into ChatGPT.
“You little spender,” Leo teased. “Well, my queen, if it makes your life better, smoother and more connected to me, then I would say it is worth the hit to your wallet.”
