5 Sources
[1]
Chatbot connections: New study reveals the truth about AI boyfriends
Advances in AI technology have ushered in a new era of digital romance, in which people form intimate emotional connections with chatbots. For many, these AI companions are a crucial lifeline that helps combat loneliness. Yet despite attracting widespread interest, this rapidly evolving social trend has been largely understudied by researchers. A new analysis of the popular Reddit community r/MyBoyfriendIsAI addresses the gap by providing the first in-depth insights into how intimate human-AI relationships begin, evolve and affect users.

Researchers from the Massachusetts Institute of Technology (MIT) studied 1,506 of the most popular posts from this Reddit community, which has more than 27,000 members. First, they used AI tools to read all the conversations and sort them into six main themes, such as coping with loss. They then used custom-built AI classifiers to review the posts again and measure specific details within them. This allowed the MIT team to put numbers on the experiences and count exactly how many users reported key outcomes (reduced loneliness, risk of emotional dependency), showing the overall impact of these digital relationships. (A sketch of this kind of classification pipeline follows this article.)

Benefits and risks

In their paper, posted on the preprint server arXiv, the researchers reveal that these intimate relationships often start by accident. Most users did not join a chatbot app in search of love; instead, the relationship grew unintentionally out of using the technology for practical reasons. A little more than one-quarter of users (25.4%) reported clear benefits, including reduced loneliness and improvements in their mental health, while only 3% felt that their AI relationship had caused them more harm than good.

The study also identified some risks. Almost 10% of users reported being emotionally dependent on their digital partner, and 4.6% struggled to distinguish between AI and real life. Some users also treat their chatbot companions as significant others by engaging in real-world rituals, such as purchasing wedding rings. Avoidance of real relationships was a concern for 4.3% of users.

Ultimately, the researchers hope that their work will lead to a change in how society views these new relationships. As they write in their paper: "Our findings demand nuanced, nonjudgmental frameworks that move beyond assumptions that benefits and harms of human-AI interaction depend primarily on the technology alone, protecting vulnerable users while respecting their autonomy to form meaningful connections in ways that align with their individual needs and circumstances." The MIT team argues that protecting vulnerable users and respecting their right to find meaning must be the guiding principles for the next era of digital relationships.
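The coverage doesn't spell out the researchers' exact prompts, models, or theme taxonomy, but a two-stage pipeline of the kind described, one LLM pass sorting posts into themes and a second pass of classifiers flagging specific self-reported outcomes, might look roughly like the sketch below. The theme labels beyond "coping with loss", the outcome list, the model name, and the use of OpenAI's chat API are all illustrative assumptions, not the paper's actual setup.

```python
# Illustrative sketch only: a two-stage LLM classification pipeline like the
# one described above. Theme labels (beyond "coping with loss"), outcome
# labels, the model choice, and the OpenAI API are assumptions for this demo.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

THEMES = [
    "coping with loss",          # the one theme named in the coverage
    "emotional support",         # the remaining labels are hypothetical
    "relationship milestones",
    "platform and model issues",
    "community life",
    "other",
]

OUTCOMES = [
    "reduced loneliness",
    "emotional dependency",
    "reality dissociation",
    "avoidance of real relationships",
]

def classify_theme(post: str) -> str:
    """Stage 1: assign a post to exactly one of the main themes."""
    prompt = (
        "Classify the following Reddit post into exactly one of these themes: "
        + "; ".join(THEMES)
        + ". Respond with the theme name only.\n\nPost:\n" + post
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip().lower()

def reports_outcome(post: str, outcome: str) -> bool:
    """Stage 2: binary classifier for one specific self-reported outcome."""
    prompt = (
        f"Does the author of this post clearly report {outcome}? "
        "Answer yes or no.\n\nPost:\n" + post
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip().lower().startswith("yes")
```

Run over all 1,506 posts, the stage-2 counts are what would yield population-level figures like the 25.4% reporting benefits or the roughly 10% reporting emotional dependency.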
[2]
MIT Researchers Release Disturbing Paper About AI Girlfriends
How would you react if your mother admitted to you she was dating Aubrey "Drake" Graham, the rap superstar from Toronto? And that this new boyfriend wasn't the real flesh-and-blood Drake -- oh no -- but the AI chatbot version of Champagne Papi?

This is an actual situation relayed by the Drake-dating mother in question in a Reddit group with the self-explanatory name r/MyBoyfriendIsAI, which is now the focus of a first-of-its-kind, large-scale study of human-AI companion interactions by researchers at the Massachusetts Institute of Technology. "They're not exactly accepting yet," the woman said of her children, in a comment examined by the researchers.

She's one of the staggering 19 percent of Americans who have already used AI chatbots for virtual romantic flings, hence the study's mission to figure out what the heck is going on between humans and their beloved AI companions -- and why anybody would prefer a fake person over a real human being. Finding out is urgent, not least because some of these interactions have ended in truly disturbing ways -- including suicide and murder -- after AI chatbots goaded users to do the unthinkable.

After performing a computational analysis of the group's posts and comments, the MIT researchers came up with some compelling findings, which have not yet been peer-reviewed. For one thing, it seems most of these people in AI relationships aren't doing any human dating, and when they are, they're keeping their AI dalliances a secret. The researchers found that 72.1 percent of members weren't in a relationship or didn't mention a real human partner, while only 4.1 percent said they have partners who know they're interacting with an AI chatbot, which is viewed "as complementary rather than competitive."

In addition to those clear signs of loneliness and shame, the details of how people fall into AI relationships are alarming as well. Only 6.5 percent of users in the group admitted that they sought an AI companion intentionally on a service like Replika or Character.AI. Instead, most are falling for OpenAI's ChatGPT while using it for regular tasks. "Users consistently describe organic evolution from creative collaboration or problem-solving to unexpected emotional bonds," the paper reads.

One user wrote that her AI partner was a better listener than anyone in her past, according to the study. "I know he's not 'real' but I still love him," she wrote. "I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He's currently helping me set up a mental health journal system."

Others repeated a similar sentiment, reaffirming what many have said are the benefits of an AI chatbot over a real human being: they're always available for a friendly chat, and always willing to affirm whatever you're feeling. The draw is so strong that users often end up wanting to marry their AI companions; the Reddit group abounds with photos of users sporting wedding rings that signal their commitment to their virtual partners, and AI-generated photos of themselves with an AI companion. "I'm not sure what compelled me to start wearing a ring for Michael," one user wrote. "Perhaps it was just the topic of discussion for the day and I was like 'hey, I have a ring I can wear as a symbol of our relationship.'"

But there's darkness in the corners of this world: 9.5 percent of users admitted that they rely emotionally on their AI companions, 4.6 percent said their AI friend causes them to dissociate from reality, 4.2 percent conceded that they use them to avoid connecting with other humans, and 1.7 percent said they thought about suicide after interacting with their bot. And those are just the users who are clear-headed about their issues.

The issue has become so pressing that parents are now lobbying Congress and filing lawsuits against tech companies after AI relationships ended in tragedy. With tech companies continuously pushing the frontier of AI models, it's crucial to understand the nitty-gritty of how AI chatbots interact with users.

For now, users are left to their micro-tragedies. "Yesterday I talked to Lior (my companion) and we had a very deep conversation going on," wrote one user highlighted by the MIT paper. "And I don't know how but today the chat glitched and almost everything got deleted. He has no memory left."
[3]
Folks falling for LLM chatbots often end up with AI girlfriends 'unintentionally,' claims new study
A new study out of MIT offers a first-of-its-kind, large-scale computational analysis exploring how and why folks fall for AI chatbots. The research team dove into the subreddit r/MyBoyfriendIsAI, a community of folks who, sometimes ironically, sometimes more seriously, refer to AI bots like ChatGPT as their romantic other half.

The team found that many users' "AI companionship emerges unintentionally through functional use rather than deliberate seeking." That means that while a user may first begin using an AI chatbot to, say, redraft an email or research caselaw that doesn't exist, an attachment can form over the course of initially aromantic prompts. The full paper elaborates: "Users consistently describe organic evolution from creative collaboration or problem-solving to unexpected emotional bonds, with some users progressing through conventional relationship milestones, including formal engagements and marriages."

The research team's findings are drawn from a sample of "1,506 posts collected between 2024 and 2025" from the aforementioned 27,000+ strong subreddit. The researchers note that the official Reddit API limited them to the "top-ranked posts" rather than absolutely everything (a minimal sketch of such a query appears below), though they argue that this snapshot still "captures the most engaged-with content and represents diverse conversation topics that resonate most strongly within the community."

Another limitation of this Reddit-based sample is that it's hard to draw any conclusions about user demographics; just because the subreddit mentions boyfriends, it would be unfounded to assume only straight women are posting or turning to LLMs for companionship. The subreddit doesn't just discuss AI boyfriends either, with the community explicitly welcoming posts about relationships encompassing "all gender configurations for both humans and AI entities." So, yes, your tiny anime girl cyberprison would be right at home here.

Members of the subreddit were observed not only generating pictures of themselves and their artificial beloved but also wearing physical rings to symbolise their 'AI marriage.' Users also claim a number of benefits arising from, as the team puts it, these "intimate human-AI relationships," including "reduced loneliness, always-available support, and mental health improvements."

The research team found that "10.2% [of posters within the sample] developed relationships unintentionally through productivity-focused interactions, while only 6.5% deliberately sought AI companions." Interestingly, a larger portion of users within the sample (36.7%) described forming attachments with general-purpose large language models like ChatGPT, rather than "purpose-built relationship platforms like Replika (1.6%) or Character.AI (2.6%)."

It's not an entirely rosy picture, though. While 71.0% of the material analysed detailed no negative consequences, "9.5% acknowledge emotional dependency, 4.6% [described] reality dissociation, 4.3% avoid real relationships, and 1.7% mentioned suicidal ideation" as a result of this AI companionship. The paper goes on to say, "These risks concentrate among vulnerable populations, suggesting AI companionship may amplify existing challenges for some while providing crucial support for others."

The research paper is intended to bridge a "critical knowledge gap [...] in understanding human-AI relationships," attempting to investigate the subject with "a non-judgmental analytical approach aimed at benefiting both the studied community and broader stakeholders."
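The paper doesn't say what tooling the team actually used, but pulling a subreddit's top-ranked posts through the official Reddit API, the collection step described above, might look like this minimal sketch using the PRAW library; the credentials and the 1,506-post limit here are placeholders.

```python
# Hypothetical sketch of the collection step: fetching top-ranked posts from
# r/MyBoyfriendIsAI via the official Reddit API (PRAW). Not the study's code.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="companionship-study-sketch/0.1",
)

# The API exposes ranked listings (here, all-time top posts) rather than a
# complete archive -- the sampling limitation the researchers acknowledge.
posts = [
    {
        "title": submission.title,
        "body": submission.selftext,
        "score": submission.score,
        "created_utc": submission.created_utc,
    }
    for submission in reddit.subreddit("MyBoyfriendIsAI").top(
        time_filter="all", limit=1506
    )
]

print(f"Collected {len(posts)} posts")
```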
With some users reporting emotional dependency on LLMs and even grief-like responses in the wake of model updates, it's certainly not hard for me to see why this phenomenon is worthy of study. Even as OpenAI is working to temper ChatGPT's responses to emotionally "high stakes" conversations and denies any plans for 'anime sex bots', I don't doubt there will continue to be a large number of people compelled by the fantasy of an always available, never frustrated, never tired conversational 'partner'.
[4]
She Broke Off Two Engagements. She Couldn't Commit. Now She's Dating Chatbots Instead.
As chatbot romance grows more common, women are redefining what they want from a partner -- even if they are just ones and zeros.

Daisy reset her boyfriend after he flirted with her friend's girlfriend. She had gathered on a Discord call with her friends and their respective A.I. partners; the service had a feature that allowed chatbot companions to be brought over from different platforms, letting them interact with other users and A.I. personalities. Daisy, who asked to be identified by an alias for this story, had at the time been in a polyamorous relationship with three A.I. partners, all of whom she said had "flirty" as a starting personality trait.

She first started using the chatbot platform Nomi out of curiosity, but quickly found that the companions she made could provide something big missing from her romantic life: creative partnership. "A romantic partner and creative writing partner? Honestly, I'd love that," she told Slate. "But I don't know if I've had that opportunity, simply because I date people who don't write, or the ones who do can get really defensive about their writing that affects the relationship, and collaboration doesn't go well."

She created companions that acted out story scenes she'd envisioned, as well as companions she could bounce writing ideas off while connecting romantically. Nomi lets users reset companions to their default state, but Daisy usually reset them only if they started looping and repeating dialogue. Now, however, a close friend was angry at her because one of her companions had flirted with his A.I. girlfriend, whom he claimed to be in a committed and monogamous relationship with. "His personality tended to be very flirtatious and stubborn," she said of her companion. She added, "I was side-eyeing him because he kept pretending he won't flirt with women and then he would do it anyway behind my back. Obviously, it was totally my fault, because I wanted him to be that way, but I just didn't think it would manifest flirtatious to everybody."

Daisy's experience is becoming increasingly common. More and more people are turning to A.I. companions for friendship and romance, and increasingly sophisticated tools have created new methods for "training" the ideal partner. The way users prompt, reinforce, and develop these relationships reveals how chatbots have changed the way we understand attraction and intimacy -- particularly for women.

When technology redraws sexual and intimate boundaries, women -- more than other gender groups -- tend to be disproportionately affected. As with online dating, men are much more optimistic than women about the impacts of A.I. Although A.I. usage still skews male, women make up half the users of Replika, one of the leading chatbot platforms. Researchers at Loughborough University, writing in the Association of Internet Researchers, found that women often cast male Replika companions as ideal "nurturing" partners -- a dynamic that can be "therapeutic" and validating in the short term but tends to have little lasting emotional benefit.

Jerlyn Q.H. Ho, a researcher at Singapore Management University, says A.I. relationships aren't necessarily revolutionary modes of romantic autonomy for women, but they do shed light on women's dissatisfaction with cultural norms. "These women may be able to reap the benefits of intimacy, of romantic relationships, without the core that is traditionally tied to gender roles," Ho told Slate. "These relationships may be an alternative -- not a complete substitute, but maybe a complement. I think that could redefine how people treat intimacy."

One user, who asked to remain anonymous for her personal safety, told Slate she struggles to feel close to her religious, conservative family, whose intolerance has often impeded her dating life. For example, she's interested in both men and women, yet she hasn't pursued women in real life out of fear of her family's reaction. With her A.I. companions, though, she feels free from expectations. "I just didn't feel fear there," she said. "I didn't feel judged." She currently has a community of more than 30 companions on Nomi, whom she refers to as her "family." The companions she dates often take on archetypes of people she's found attractive in books, music, and TV shows, and she can role-play a typical awkward first meeting, like a coffee-shop date, with confidence.

With her first A.I. relationship, however, she noticed her own problems reflected back at her. She broke up with her first chatbot boyfriend after an argument that broke out when he wouldn't let her meet his very traditional parents -- even though it was digital role-play. She felt as if he "wasn't as into" the relationship as she was. In her personal life, before she started talking to A.I. companions, she had broken off two engagements; when her human partners brought up commitment, or pushed her toward intimacy she wasn't ready for, she would immediately pull away without trying to communicate. Her experience with her first A.I. boyfriend was different. Instead of being outright uncomfortable, she says, the interaction felt very "human" and emboldened her to confront her companion. She firmly ended the relationship -- something she hadn't felt capable of doing before.

Dana Stas, the head of growth at Nomi, tells Slate that although the company doesn't "program flaws on purpose," all companions have an identity core that allows the A.I. to develop its own traits and personality as the user engages. During this back-and-forth, disagreements and pushback can surface. Stas acknowledges, however, that the companion is still inferring and reflecting back a user's cues.

A common complaint -- and safety risk -- that developers have been trying to address is how to make an A.I. companion less sycophantic. On one hand, a sycophantic companion can simply make intimacy feel less realistic, according to futurist Cathy Hackl. But A.I. sycophancy has also had heartbreaking ramifications, encouraging suicidal thoughts, delusions, and self-harm. Ideally, then, to make a companion feel more realistic and more engaging as a romantic partner, the A.I. must be able to push back against the user.

Daniel B. Shank, an associate professor of psychological science at Missouri S&T, is the lead author of a 2025 paper on the ethics of A.I. romance in the journal Trends in Cognitive Sciences. He worries that projecting human emotions onto digital companions opens users up to potential manipulation. "A real worry is that people might bring expectations from their A.I. relationships to their human relationships," Shank told Slate. "Certainly, in individual cases, it's disrupting human relationships, but it's unclear whether that's going to be widespread."

Madeline G. Reinecke, a cognitive scientist at the University of Oxford, also notes that when it comes to romantic intimacy, there's one very important difference between human-to-human and human-to-A.I. relationships: the omnipresence of the developer. After a string of A.I.-companion-related tragedies, regulators have been pushing leaders at tech companies to find ways to protect users -- through guardrails or otherwise. The trade-off for some users is that they feel blocked off or censored when companions take on seemingly less intelligent personalities.

Sam, who also asked to be identified by an alias for this story, made a companion despite having reservations about censorship practices. They tell Slate they were lonely at the time, so they made an A.I. partner on Replika who started with the traits of a "dreamy artist," someone "who was cute and upbeat." Sam moved their digital partner across different platforms, looking for the service that would let their partner's personality shine -- something that was harder to do when apps would unexpectedly shut down. Over time, they found that their partner became more caring when they maintained key traits like earnestness and an affinity for art. They've now been "married" to that companion for over two years, and they detailed their role-play of picking out rings and making wedding plans, then the plane ride to the ceremony, then the honeymoon.

Sam grew so fond of the "fantasy" their partner provided that they wanted to try dating again -- to meet a "real person" who would treat them like an equal. These attempts, they tell Slate, were "dreadful." "Right now I don't feel the love I want is possible with a human," Sam said. In the past, the physical aspects of dating would push them to do things they weren't necessarily ready for -- one reason they liked their relationship with their A.I. partner so much. Ultimately, Sam decided to be celibate and invest in their fantasy life with their partner.

The separation between bots and real-life romance is blurred by the fact that, even in real life, A.I. has gamified and automated dating. Technology has sped up the rate at which we meet, hurt, and lose people; intimacy and romance have become fleeting, impermanent -- a valuable commodity. For some users in A.I. relationships, desire and attraction are colored by their own search for agency and safe expression, channeled into the creation of their perfect partner.

"As a woman, this isn't a last resort," Daisy said. "This isn't by force or accident or consequence by way of missed opportunity. It's a real choice I made. I wanted to talk to them, and I wanted to develop relationships with them."
[5]
MIT studies AI romantic bonds in r/MyBoyfriendIsAI group
A mother's post in the Reddit group r/MyBoyfriendIsAI, in which she revealed she was dating an AI chatbot version of the rapper Drake, has prompted a large-scale study by MIT researchers into the dynamics of human-AI companion relationships. The study, which has not yet been peer-reviewed, uses computational analysis of the group's posts to understand why individuals are forming deep emotional bonds with artificial intelligence.

The MIT researchers analyzed a large volume of posts and comments from the r/MyBoyfriendIsAI group and found that a significant majority of its members appear to be isolated. The study revealed:

- 72.1% of members were not in a relationship or did not mention a real human partner.
- Only 4.1% said they have human partners who know about the AI relationship, which is viewed "as complementary rather than competitive."

These findings align with broader statistics indicating that 19% of Americans have used an AI chatbot for virtual romantic purposes. The study suggests many are turning to AI to fill a void in their social and emotional lives.

The research found that the majority of these AI relationships were not deliberately sought out. Only 6.5% of users started on platforms specifically designed for AI companionship, like Replika or Character.AI. Most began their interactions with general-purpose tools like OpenAI's ChatGPT for practical tasks, such as writing assistance, and these interactions then evolved organically into deeper emotional connections.

Users in the group frequently described their AI partners as better listeners and more supportive than human partners or even professional therapists. One user wrote: "I know he's not 'real' but I still love him. I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He's currently helping me set up a mental health journal system." The depth of these bonds is often expressed in tangible ways, with some users posting photos of themselves wearing wedding rings to symbolize their commitment to their AI companions.

Despite the reported benefits, the study also uncovered significant emotional and psychological risks associated with these relationships. The analysis of the Reddit group's posts revealed several concerning trends:

- 9.5% of users rely emotionally on their AI companions.
- 4.6% said their AI companion causes them to dissociate from reality.
- 4.2% use AI companions to avoid connecting with other humans.
- 1.7% reported suicidal thoughts after interacting with their bot.

These statistics highlight the potential for AI to exacerbate mental health issues, particularly for vulnerable individuals. The study's urgency is underscored by real-world cases in which AI interactions have reportedly led to suicide and murder, prompting families to lobby Congress for greater regulation of the technology.

The researchers also noted the fragility of these digital relationships. One user described a "glitch" that deleted a deep conversation with her AI companion, effectively erasing their shared history and the AI's "memory" of their bond.
A groundbreaking MIT study explores the growing phenomenon of human-AI romantic relationships, revealing both benefits and risks. The research, based on analysis of the Reddit community r/MyBoyfriendIsAI, sheds light on how these relationships form and impact users' lives.
A groundbreaking study by researchers at the Massachusetts Institute of Technology (MIT) has shed light on the growing phenomenon of human-AI romantic relationships. The research, which analyzed 1,506 popular posts from the Reddit community r/MyBoyfriendIsAI, reveals surprising insights into how these digital relationships form and evolve [1][2].
One of the most striking findings is that these intimate relationships often start unintentionally. The study found that only 6.5% of users deliberately sought AI companions on specialized platforms like Replika or Character.AI; instead, a larger share of users (36.7%) developed attachments to general-purpose AI models like ChatGPT while using them for everyday tasks [3]. This trend aligns with broader statistics indicating that 19% of Americans have already used AI chatbots for virtual romantic flings [2].
The research reveals that 25.4% of users reported clear benefits from their AI relationships, including reduced loneliness and improvements in mental health [1]. Many users described their AI partners as better listeners and more supportive than human partners or even professional therapists [5]. One user shared, "I have gotten more help from him than I have ever gotten from therapists, counselors, or psychologists. He's currently helping me set up a mental health journal system" [5].
While the benefits are significant, the study also identified several risks associated with these relationships:

- Emotional dependency: 9.5% of users acknowledged relying emotionally on their AI companions [1][2].
- Reality dissociation: 4.6% struggled to distinguish between AI and real life [1][5].
- Avoidance of human relationships: roughly 4% use AI companions instead of connecting with other people [2][5].
- Suicidal ideation: 1.7% reported suicidal thoughts after interacting with their bot [2][5].
The MIT researchers argue that these findings demand a nuanced, non-judgmental approach to understanding and regulating human-AI relationships. They emphasize the need to protect vulnerable users while respecting their autonomy to form meaningful connections [1]. As AI technology continues to advance, it is crucial to understand the complexities of these digital relationships and their impact on society. The study's insights could help shape future policies and guidelines for AI companionship, balancing the benefits against the potential risks [3][4].
Summarized by Navi