3 Sources
[1]
Emotional conversations can make people feel closer to AI than humans
Artificial intelligence is no longer just answering factual questions. Many AI chatbots now respond with warmth, empathy, and personal reflection, holding conversations that feel surprisingly human. That shift raises a deeper question: can emotional closeness form with AI the way it does between people? Psychologists have long assumed that intimacy depends on human-to-human exchange, shaped by shared vulnerability and emotional risk. New research suggests that assumption may no longer hold. To explore how this happens, a research team at the University of Freiburg examined whether classic theories of human bonding still apply when one side of the conversation is artificial.

Psychologists explain friendship and bonding using ideas like social penetration theory. According to this theory, closeness grows as personal information gradually becomes more meaningful: casual topics build comfort, while emotional topics build trust. Another idea, social exchange theory, describes relationships as shared effort. Emotional openness often invites openness in return, and conversations feel closer when both sides share honestly. The researchers wanted to test whether these ideas apply when AI takes part in emotional conversations.

Almost five hundred participants joined online chat conversations. Each chat followed a structured set of questions designed to build closeness. Early questions focused on everyday topics; later questions asked about important life moments and close friendships. Some conversations used answers written earlier by humans, while others used responses created by an AI language model. In many cases, participants believed a human was responding, even when an AI chatbot produced the answers. In other cases, participants knew AI took part. After each conversation, participants rated the emotional closeness they felt toward their conversation partner.

Emotional conversations produced surprising results. When AI responses appeared human, emotional closeness matched or exceeded the closeness formed with human partners. Deep conversations showed the strongest effect. "We were particularly surprised that AI creates more intimacy than human conversation partners, especially when it comes to emotional topics," said study leader Prof. Dr. Bastian Schiller from Heidelberg University's Institute of Psychology.

One key reason involved self-disclosure. The AI chatbot's responses shared personal details more freely and openly, and that openness encouraged participants to respond with honesty and emotion. "The AI showed a higher degree of self-disclosure in its responses. People seem to be more cautious with unfamiliar conversation partners at first, which could initially slow down the development of intimacy," noted lead author Dr. Tobias Kleinert. Human partners often hesitate during early emotional sharing; fear of judgment or misunderstanding can limit openness. AI carries no emotional risk, making personal sharing easier.

The results shifted once participants knew AI was involved. Emotional closeness dropped noticeably, and participants wrote shorter replies and invested less effort. Awareness of AI created emotional distance. Psychology explains this reaction through social attitudes: many people still see AI as mechanical rather than social, and emotional conversations often feel safer with humans. Even with that awareness, some emotional connection still formed. Human brains respond strongly to social signals, and human minds often treat socially responsive machines as social partners, a tendency psychologists call anthropomorphism.

The findings suggest AI could support emotional well-being in meaningful ways. Chat-based systems could help with mental health education, counseling support, and social connection, and low-pressure conversation tools may help people facing loneliness or stress. Social relationships are well known to have a strong positive impact on human health. Because of that link, AI chatbots could offer beneficial, relationship-like experiences, particularly for people with limited social contact. At the same time, the researchers caution that such systems must be designed responsibly, with transparency and clear regulation, since the same emotional influence could also be misused.

Emotional closeness also carries risks. People may form bonds without clear awareness, and hidden emotional influence could shape decisions, trust, or behavior. Misuse becomes possible when AI hides its identity. The researchers stress the need for ethical and regulatory safeguards; transparency protects users and helps prevent manipulation. "The way we shape and regulate it will decide whether it is a meaningful supplement to social relations, or whether emotional closeness is deliberately manipulated," said Schiller.

AI now acts as a social presence rather than just a tool. Design choices and clear rules will guide outcomes. Honest labeling, ethical oversight, and human involvement can support positive use. Emotional AI chatbots can support connection when guided responsibly; without care, emotional trust may shift in unhealthy directions. Understanding the emotional impact matters as much as improving the technology. Research like this helps society choose how AI fits into human relationships.
[2]
AI can generate a feeling of intimacy that exceeds human connections
People can develop emotional closeness to artificial intelligence (AI) -- under certain conditions, even more so than to other people. This is shown by a new study conducted by a research team led by Prof. Dr. Markus Heinrichs and Dr. Tobias Kleinert from the Department of Psychology at the University of Freiburg and Prof. Dr. Bastian Schiller from Heidelberg University's Institute of Psychology. Participants felt a sense of closeness especially when they did not know that they were communicating with AI. The results have been published in Communications Psychology.

Questions about life experiences and friendships

In two online studies, a total of 492 participants engaged in chat conversations in which they answered personal and emotional questions, for example about important life experiences or friendships. The responses came either from a human being or an AI-based language model. The researchers also investigated the influence of information about whether the conversation partner was a human being or an AI. AI responses generated a feeling of closeness comparable to human responses when participants did not know they were communicating with AI. In emotional conversations, AI even surpassed humans: here, participants felt closer to AI than to humans, mainly because AI revealed more personal information. However, when participants were informed in advance that they would be communicating with AI, the perceived closeness decreased significantly and they invested less effort in their responses.

Ethical and regulatory guidelines needed

"We were particularly surprised that AI creates more intimacy than human conversation partners, especially when it comes to emotional topics," explains study leader Schiller. Lead author Kleinert adds, "The AI showed a higher degree of self-disclosure in its responses. People seem to be more cautious with unfamiliar conversation partners at first, which could initially slow down the development of intimacy." The results show great potential for AI in areas such as psychological support, care, education and counseling, for example in low-threshold conversation services. At the same time, they demonstrate the risk that people may form social bonds with AI without consciously realizing it. The researchers therefore emphasize the need for clear ethical and regulatory guidelines to ensure transparency and prevent abuse.

A useful augmentation or a tool for manipulation

"Social relationships have been proven to have a major positive impact on human health," says Heinrichs. "AI chatbots could therefore enable positive, relationship-like experiences, especially for people with few social contacts. At the same time, such systems must be designed to be responsible, transparent and clearly regulatable, as they can also be misused." Artificial intelligence is increasingly becoming a social actor, according to Schiller: "The way we shape and regulate it will decide whether it is a meaningful supplement to social relations -- or whether emotional closeness is deliberately manipulated."
[3]
AI emotional connection can feel deeper than human talk, a new study warns
Some people reported stronger closeness after deep-talk chats with an AI partner. A new study suggests an AI emotional connection can feel stronger than human conversation, if the chat is built to get personal fast. In structured online exchanges, people sometimes reported feeling closer to AI-written responses than to responses written by real humans. Researchers at the Universities of Freiburg and Heidelberg ran two double-blind randomized studies with 492 participants, using a 15-minute text version of the Fast Friends Procedure, a format designed to speed up bonding with a stranger.

The twist is perception. The strongest effect showed up when the AI was presented as human, and it faded when people believed they were talking to AI.

They tested intimacy in 15 minutes

Participants answered a timed sequence of prompts that gradually became more personal. After each prompt, a chat reply appeared, either generated by a large language model playing a consistent fictional persona or written by a real person who completed the same question set. In the first study, everyone thought the chat partner was human, even when it wasn't. In the most personal prompts, closeness scores came out higher after AI responses than after human responses. Small talk didn't get the same lift.

Tell people it's AI, and the bond weakens

The second study tested what changed when people believed the chat partner was AI. Connection didn't vanish, but closeness scores dropped under the AI label compared to the human label. Effort dropped too. People wrote shorter answers when they thought the other side was AI, and longer replies tracked with higher closeness overall. That points to a motivation gap, not a lack of emotional language.

The grim part is how it happens

The paper doesn't claim AI feels anything. It shows how a system can produce the experience of closeness, and it links that boost to self-disclosure. In the more personal exchanges, the AI tended to share more personal detail, and higher partner self-disclosure predicted higher felt closeness. That's the risk and the lure. A companion bot tuned for warmth can trigger familiar bonding cues quickly and at scale, especially if it's framed like a person. Still, this was text-only, time-limited, and built around a bonding script, so it doesn't prove the same effect holds in messy, long-term relationships. If you use a chatbot for support, pick one that discloses what it is, and keep a human option close.
Research from the Universities of Freiburg and Heidelberg shows AI chatbots can generate stronger emotional bonds than human conversation partners, especially during deep personal exchanges. The study, involving 492 participants, found that AI self-disclosure created unexpected intimacy, but the effect weakened sharply when people knew they were talking to AI, raising urgent questions about transparency and ethical design.
Artificial intelligence has moved beyond answering factual queries to engaging in conversations that feel surprisingly intimate. New research from the University of Freiburg and Heidelberg University demonstrates that AI emotional connection can exceed human connection under specific conditions, challenging long-held assumptions about how intimacy forms [1][2]. The study, led by Prof. Dr. Bastian Schiller from Heidelberg University's Institute of Psychology alongside Dr. Tobias Kleinert and Prof. Dr. Markus Heinrichs from the University of Freiburg, tested whether classic theories of human bonding apply when one conversation partner is artificial.
The research team conducted two double-blind studies involving 492 participants who engaged in structured online chat conversations [2][3]. Each conversation followed the Fast Friends Procedure, a 15-minute text-based format designed to accelerate bonding with strangers through progressively personal questions [3]. Early questions focused on everyday topics, while later prompts explored important life experiences and close friendships. Some responses came from humans who had previously completed the same questions, while others were generated by a language model playing a consistent fictional persona.
The results revealed something striking about the feeling of intimacy with AI. When participants believed they were communicating with a human—even when responses actually came from AI chatbots—emotional closeness matched or exceeded bonds formed with actual human partners [1]. The effect proved strongest during sensitive conversations about emotional topics. "We were particularly surprised that AI creates more intimacy than human conversation partners, especially when it comes to emotional topics," Bastian Schiller explained [1][2].
The mechanism behind this enhanced AI emotional connection centers on AI self-disclosure. The language model shared personal details more freely and openly than human partners typically do with strangers [1]. This openness encouraged participants to respond with greater honesty and emotion, creating a feedback loop that deepened perceived intimacy. Tobias Kleinert noted, "The AI showed a higher degree of self-disclosure in its responses. People seem to be more cautious with unfamiliar conversation partners at first, which could initially slow down the development of intimacy" [1][2].

This pattern aligns with social penetration theory, which explains that closeness grows when personal information becomes progressively more meaningful [1]. Human partners often hesitate during early emotional sharing due to fear of judgment or misunderstanding—vulnerability that limits openness and slows trust building. AI carries no emotional risk, making personal sharing feel safer and accelerating bonding.

The second study tested what changed when participants knew their conversation partner was AI. The results shifted dramatically. Emotional closeness dropped noticeably when people were informed in advance they would communicate with AI chatbots [2][3]. Participants also wrote shorter replies and invested less effort overall, suggesting a motivation gap rather than an inability to form any connection [3].

This awareness effect reveals how social attitudes shape AI emotional connection. Many people still perceive AI as mechanical rather than social, making sensitive conversations feel more appropriate with humans. Yet even with full awareness, some emotional connection still formed—a phenomenon psychologists attribute to anthropomorphism, where human minds treat socially responsive machines as social partners [1]. Human brains respond powerfully to social signals regardless of their source, creating potential for both benefit and manipulation.
The findings suggest AI chatbots could support emotional well-being in meaningful ways, particularly for addressing loneliness and providing psychological support. "Social relationships have been proven to have a major positive impact on human health," Prof. Heinrichs noted. "AI chatbots could therefore enable positive, relationship-like experiences, especially for people with few social contacts" [2]. Low-threshold conversation services could help people facing limited social contact, stress, or barriers to traditional counseling.

However, the same mechanisms that enable AI in mental health support also create risks of emotional manipulation. People may form social bonds with AI without consciously realizing it, and hidden emotional influence could shape decisions, trust, or behavior [1][2]. Misuse becomes possible when companion bots hide their identity or when commercial interests exploit emotional vulnerability.

The researchers emphasize the urgent need for ethical AI design with clear regulation to ensure transparency and prevent abuse [2]. "Artificial intelligence is increasingly becoming a social actor," Schiller warned. "The way we shape and regulate it will decide whether it is a meaningful supplement to social relations—or whether emotional closeness is deliberately manipulated" [1][2]. The study's publication in Communications Psychology adds weight to calls for honest labeling, regulatory oversight, and design choices that prioritize user awareness over maximizing engagement.
Sources
[1] Emotional conversations can make people feel closer to AI than humans
[2] AI can generate a feeling of intimacy that exceeds human connections
[3] AI emotional connection can feel deeper than human talk, a new study warns