4 Sources
[1]
Comparing the value of perceived human versus AI-generated empathy - Nature Human Behaviour
Artificial intelligence (AI) and specifically large language models demonstrate remarkable social-emotional abilities, which may improve human-AI interactions and AI's emotional support capabilities. However, it remains unclear whether empathy, encompassing understanding, 'feeling with' and caring, is perceived differently when attributed to AI versus humans. We conducted nine studies (n = 6,282) where AI-generated empathic responses to participants' emotional situations were labelled as provided by either humans or AI. Human-attributed responses were rated as more empathic and supportive, and elicited more positive and fewer negative emotions, than AI-attributed ones. Moreover, participants' own uninstructed belief that AI had aided the human-attributed responses reduced perceived empathy and support. These effects were replicated across varying response lengths, delays, iterations and large language models and were primarily driven by responses emphasizing emotional sharing and care. Additionally, people consistently chose human interaction over AI when seeking emotional engagement. These findings advance our general understanding of empathy, and specifically human-AI empathic interactions.
[2]
People have empathy with AI... when they think it's human
Study finds emotional support from chatbots is more readily accepted if participants don't know it's an AI

A study of AI chat sessions has shown people tend to have more empathy with a chatbot if they think it is human. Nine studies involving 6,282 participants found humans are more likely to reject emotional support from LLM-based chatbots unless they are told they are talking to a human.

Research published in Nature Human Behaviour said that some AI models "show remarkable abilities to engage with social-emotional situations and to mimic expressions of interpersonal emotional reactions, such as empathy, compassion and care." It points out that AI may help with social interaction and provide emotional support, since earlier studies have shown that LLM-powered tools can determine a human's emotional state and that their responses can be perceived as empathetic.

Anat Perry, associate professor at Jerusalem's Hebrew University, and her colleagues studied subjects' responses to AI conversation. The participants were told that the responses were written either by a human or by an AI chatbot, even though they all came from an AI. Although subjects had some empathy with all responses, they tended to have more empathy with responses they falsely believed came from a human. They were also willing to wait longer for responses they believed came from humans.

"Human-attributed responses were rated as more empathic and supportive, and elicited more positive and fewer negative emotions, than AI-attributed ones. Moreover, participants' own uninstructed belief that AI had aided the human-attributed responses reduced perceived empathy and support. These effects were replicated across varying response lengths, delays, iterations and large language models and were primarily driven by responses emphasizing emotional sharing and care," the researchers said. "Additionally, people consistently chose human interaction over AI when seeking emotional engagement.
These findings advance our general understanding of empathy, and specifically human-AI empathic interactions."

The research paper argues this is an interesting result, particularly because, at face value, LLMs actually have more empathy than humans. It cited studies showing empathy is hard work: people often fail to take another person's perspective, and fail to empathize when fatigued or burned out. They might also avoid empathy altogether because it is taxing.

"In contrast to humans, LLMs can accurately capture our emotional situation by analyzing patterns and associations of phrases, and they do so instantly, without tiring or being preoccupied. These models show impressive abilities to infer emotional states," the paper said.

Other research has suggested it might be a two-way street: LLMs can show signs of apparent emotional stress. Earlier this year, a study published in Nature found that when GPT-4 was subjected to traumatic narratives and then asked to respond to questions from the State-Trait Anxiety Inventory, its anxiety score "rose significantly" from a baseline of no/low anxiety to a consistently highly anxious state.

While some skepticism about the inferred emotional states of chatbots is justified, it's worth remembering that people perform badly at telling AI poetry apart from human bards. A study published in November last year suggested readers mistake the complexity of human-written verse for incoherence created by AI, and underestimate how human-like generative AI can appear.
[3]
People Prefer Human Empathy, Even When AI Says the Same Thing - Neuroscience News
Summary: A new study shows that people rate empathic responses as more supportive and emotionally satisfying when they believe they come from a human -- even if the same response is AI-generated. Across nine experiments with over 6,000 participants, responses labeled as human were consistently seen as more genuine, especially when they involved emotional sharing and care. Participants were even willing to wait longer for a human reply rather than receive an instant message from an AI. These findings suggest that perceived authenticity plays a crucial role in how empathy is received and highlight the emotional limitations of AI in sensitive settings.

A new international study led by Prof. Anat Perry from the Hebrew University of Jerusalem and her PhD student Matan Rubin, in collaboration with Prof. Amit Goldenberg and researchers from Harvard University, and Prof. Desmond C. Ong from the University of Texas, finds that people place greater emotional value on empathy they believe comes from humans -- even when the exact same response is generated by artificial intelligence.

Published in Nature Human Behaviour, the study involved over 6,000 participants across nine experiments. The researchers tested whether people perceived empathy differently depending on whether it was labeled as coming from a human or from an AI chatbot. In all cases, the responses were crafted by large language models (LLMs), yet participants consistently rated the "human" responses as more empathic, more supportive, and more emotionally satisfying than the identical "AI" responses.

"We're entering an age where AI can produce responses that look and sound empathic," said Prof. Perry. "But this research shows that even if AI can simulate empathy, people still prefer to feel that another human truly understands, feels with them, and cares."
The preference was especially strong for responses that emphasized emotional sharing and genuine care -- the affective and motivational components of empathy -- rather than just cognitive understanding. In fact, participants were even willing to wait days or weeks to receive a response from a human rather than get an immediate reply from a chatbot.

Interestingly, when participants believed an AI may have helped generate or edit a response they thought was from a human, their positive feelings diminished significantly. This suggests that perceived authenticity -- believing that someone genuinely invested time and emotional effort -- plays a critical role in how we experience empathy.

"In today's world, it's becoming second nature to run our emails or messages through AI," said Prof. Perry. "But our findings suggest a hidden cost: the more we rely on AI, the more our words risk feeling hollow. As people begin to assume that every message is AI-generated, the perceived sincerity, and with it, the emotional connection, may begin to disappear."

While AI shows promise for use in education, healthcare, and mental health settings, the study highlights its limitations. "AI may help scale support systems," Perry explains, "but in moments that require deep emotional connection, people still want the human touch." The study offers key insights into the psychology of empathy and raises timely questions about how society will integrate emotionally intelligent AI into our daily lives.
[4]
Why human empathy still matters in the age of AI
A new international study finds that people place greater emotional value on empathy they believe comes from humans -- even when the exact same response is generated by artificial intelligence. Published in Nature Human Behaviour, the study involved over 6,000 participants across nine experiments.

The researchers, led by Prof. Anat Perry from the Hebrew University of Jerusalem and her Ph.D. student Matan Rubin, in collaboration with Prof. Amit Goldenberg of Harvard University and Prof. Desmond C. Ong of the University of Texas, tested whether people perceived empathy differently depending on whether it was labeled as coming from a human or from an AI chatbot. In all cases, the responses were crafted by large language models (LLMs), yet participants consistently rated the "human" responses as more empathic, more supportive, and more emotionally satisfying than the identical "AI" responses.

"We're entering an age where AI can produce responses that look and sound empathic," said Prof. Perry. "But this research shows that even if AI can simulate empathy, people still prefer to feel that another human truly understands, feels with them, and cares."

The preference was especially strong for responses that emphasized emotional sharing and genuine care -- the affective and motivational components of empathy -- rather than just cognitive understanding. In fact, participants were even willing to wait days or weeks to receive a response from a human rather than get an immediate reply from a chatbot.

Interestingly, when participants believed an AI may have helped generate or edit a response they thought was from a human, their positive feelings diminished significantly. This suggests that perceived authenticity -- believing that someone genuinely invested time and emotional effort -- plays a critical role in how we experience empathy.

"In today's world, it's becoming second nature to run our emails or messages through AI," said Prof. Perry.
"But our findings suggest a hidden cost: the more we rely on AI, the more our words risk feeling hollow. As people begin to assume that every message is AI-generated, the perceived sincerity, and with it, the emotional connection, may begin to disappear." While AI shows promise for use in education, health care, and mental health settings, the study highlights its limitations. "AI may help scale support systems," Perry explains, "but in moments that require deep emotional connection, people still want the human touch." The study offers key insights into the psychology of empathy and raises timely questions about how society will integrate emotionally intelligent AI into our daily lives.
A comprehensive study shows that people perceive empathy more positively when they believe it comes from humans rather than AI, even when the responses are identical, highlighting the importance of perceived authenticity in emotional interactions.
A groundbreaking international study published in Nature Human Behaviour has shed light on the complex dynamics of human-AI interactions in the realm of emotional support. The research, led by Prof. Anat Perry from the Hebrew University of Jerusalem and her colleagues, involved over 6,000 participants across nine experiments, revealing a strong preference for human empathy even in an age of advanced AI capabilities [1].
The study employed a clever design where AI-generated empathic responses to participants' emotional situations were labeled as coming from either humans or AI. Remarkably, the responses labeled as human were consistently rated as more empathic, supportive, and emotionally satisfying than identical responses attributed to AI [2].
Prof. Perry noted, "We're entering an age where AI can produce responses that look and sound empathic. But this research shows that even if AI can simulate empathy, people still prefer to feel that another human truly understands, feels with them, and cares" [3].
The study uncovered that the preference for human empathy was particularly pronounced for responses emphasizing emotional sharing and genuine care, rather than mere cognitive understanding. Intriguingly, participants were willing to wait longer for a human response, highlighting the value placed on perceived authenticity [4].
An unexpected finding revealed that when participants believed AI might have assisted in generating or editing a supposedly human response, their positive perceptions diminished significantly. This suggests that the perceived authenticity of emotional investment plays a crucial role in how empathy is received [3].
While the study acknowledges AI's remarkable abilities in social-emotional engagement, it also highlights its limitations. Prof. Perry explained, "AI may help scale support systems, but in moments that require deep emotional connection, people still want the human touch" [4].
The research raises important questions about the integration of emotionally intelligent AI into daily life. As AI becomes more prevalent in communication, there's a risk that perceived sincerity and emotional connection may diminish. Prof. Perry cautioned, "As people begin to assume that every message is AI-generated, the perceived sincerity, and with it, the emotional connection, may begin to disappear" [3].
This study not only advances our understanding of empathy but also provides crucial insights into human-AI empathic interactions. As AI continues to evolve, these findings will likely inform future developments in AI-assisted emotional support systems, particularly in fields such as education, healthcare, and mental health [1].