Curated by THEOUTPOST
On Sat, 22 Mar, 12:04 AM UTC
14 Sources
[1]
OpenAI has released its first research into how using ChatGPT affects people's emotional wellbeing
"The authors are very clear about what the limitations of these studies are, but it's exciting to see they've done this," Devlin says. "To have access to this level of data is incredible." The researchers found some intriguing differences between how men and women respond to using ChatGPT. After using the chatbot for four weeks, female study participants were slightly less likely to socialize with people than their male counterparts who did the same. Meanwhile, participants who set ChatGPT's voice mode to a gender that was not their own for their interactions reported significantly higher levels of loneliness and more emotional dependency on the chatbot at the end of the experiment. OpenAI currently has no plans to publish either study. Chatbots powered by large language models are still a nascent technology, and it's difficult to study how they affect us emotionally. A lot of existing research in the area -- including some of the new work by OpenAI and MIT -- relies upon self-reported data, which may not always be accurate or reliable. That said, this latest research does chime with what scientists so far have discovered about how emotionally compelling chatbot conversations can be. For example, in 2023 MIT Media Lab researchers found that chatbots tend to mirror the emotional sentiment of a user's messages, suggesting a kind of feedback loop where the happier you act, the happier the AI seems, or on the flipside, if you act sadder, so does the AI. OpenAI and the MIT Media Lab used a two-pronged method. First they collected and analyzed real-world data from close to 40 million interactions with ChatGPT. Then they asked the 4,076 users who'd had those interactions how they made them feel. Next, the Media Lab recruited almost 1,000 people to take part in a four-week trial. This was more in-depth, examining how participants interacted with ChatGPT for a minimum of five minutes each day. At the end of the experiment, participants completed a questionnaire to measure their perceptions of the chatbot, their subjective feelings of loneliness, their levels of social engagement, their emotional dependence on the bot, and their sense of whether their use of the bot was problematic. They found that participants who trusted and "bonded" with ChatGPT more were likelier than others to be lonely, and to rely on it more. This work is an important first step toward greater insight into ChatGPT's impact on us, which could help AI platforms enable safer and healthier interactions, says Jason Phang, an OpenAI policy researcher who worked on the project. "A lot of what we're doing here is preliminary, but we're trying to start the conversation with the field about the kinds of things that we can start to measure, and to start thinking about what the long-term impact on users is," he says. Although the research is welcome, it's still difficult to identify when a human is -- and isn't -- engaging with technology on an emotional level, says Devlin. She says the study participants may have been experiencing emotions that weren't recorded by the researchers. "In terms of what the teams set out to measure, people might not necessarily have been using ChatGPT in an emotional way, but you can't divorce being a human from your interactions [with technology]," she says. "We use these emotion classifiers that we have created to look for certain things -- but what that actually means to someone's life is really hard to extrapolate."
[2]
OpenAI research suggests heavy ChatGPT use might make you feel lonelier
AI chatbots have their benefits, but only time (and more research) will reveal their full impact on users. Many people are turning to AI chatbots for personal needs, whether companionship or emotional support. In recent months, however, there have been heightened concerns about the potential harms of favoring this technology over human interaction.

According to new research from OpenAI in partnership with the Massachusetts Institute of Technology, higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people. The analysis evaluated how AI chat platforms shape users' emotional well-being and behaviors through two paired studies conducted by researchers at the organizations. The studies have not yet been peer-reviewed.

In study one, the OpenAI team conducted a "large-scale, automated analysis of nearly 40 million ChatGPT interactions without human involvement to ensure user privacy." The study combined this research data with targeted user surveys to gain insight into real-world applications. The users' self-reported opinions about ChatGPT, analyzed in conjunction with user conversations, helped to evaluate "affective use patterns."

The second study, conducted by the MIT Media Lab team, performed a Randomized Controlled Trial with about 1,000 participants who used ChatGPT over a month. This Institutional Review Board-approved controlled study was designed to pinpoint how specific platform features and types of interaction might affect users' self-reported psycho-social and emotional well-being, underscoring "loneliness, social interactions with real people, emotional dependence on the AI chatbot, and problematic use of AI."

Participants in the trial had a wide range of prior experience using ChatGPT. They were randomly assigned either a text-only version of the platform or one of two voice-based options, to use for at least five minutes daily. Some participants were instructed to have non-specific, open-ended chats, while others were told to have personal or non-personal conversations with the chatbot.

The overall findings revealed that heavy users of ChatGPT were more trusting of the chatbot and were more likely to feel lonelier and emotionally dependent on the service. However, user outcomes were influenced by personal factors, such as individuals' emotional needs, perceptions of AI, and duration of usage. For example, people who tended to get emotionally attached in human relationships and viewed AI as a friend were likelier to experience negative outcomes from chatbot use.

"More personal conversation types -- which included more emotional expression from both the user and model compared to non-personal conversations -- were associated with higher levels of loneliness but lower emotional dependence and problematic use at moderate usage levels," the researchers observed. However, "non-personal conversations tended to increase emotional dependence, especially with heavy usage."

According to the second study, users engaging with ChatGPT via the text-only option exhibited "more affective cues" in conversations compared to voice-based users. A more "engaging voice" did not lead to increased negative outcomes for users during the study compared to neutral voice or text-based interactions.

The researchers also found that very few people use ChatGPT for emotional conversations. They said the findings from both studies are a "critical first step" in understanding the impact of AI models on the human experience and will encourage more research in industry and academia. "We advise against generalizing the results because doing so may obscure the nuanced findings highlighting the non-uniform, complex interactions between people and AI systems," the researchers said.
[3]
Some ChatGPT users are addicted and will suffer withdrawal symptoms if cut off, say researchers
OpenAI and MIT worked together on a study of nearly 40 million interactions and a 28-day trial.

According to the first large-scale study of its kind, chatbots like ChatGPT can be addictive, and those who become dependent can suffer from withdrawal symptoms if disconnected from the service. OpenAI worked with MIT on this research, which examines the emotional effects of chatbot use. The researchers looked at nearly 40 million ChatGPT interactions and surveyed 4,000 people to gauge changes in the emotional well-being of the user base.

The new study seems to have been precipitated by prior research (2024) noting that some chatbot users had begun to personify and anthropomorphize AI agents -- more than a decade after Hollywood surfaced this trend in the 2013 movie Her. Chatbots often have a pet name, and their "conversational style, first-person language, and ability to simulate human-like interactions" can be both personal and personable, notes the OpenAI research paper. This leads some humans to use chatbots for support and companionship.

Intensifying these human-machine relationships, chatbot makers may be inclined to indulge in social reward hacking, using techniques such as sycophancy and/or mirroring to increase user preference ratings. Business is business. With the above in mind, and the inevitable race for the best engagement figures, it comes as little surprise that chatbots are booming. For example, the MIT paper highlights that a major Reddit community discussing AI companions has become one of the largest on the platform, with 2.3 million members.

While online tech communities may concentrate on the positive aspects of these increasingly natural and realistic AI companions -- with adept multimodal and voice interaction -- others are starting to become alarmed at the negative consequences of chatbot use. This draws parallels with (mis)use of the internet in general, social media, and gaming, notes the MIT study. In short, dabbling or light use of these things isn't usually an issue and can even be beneficial. However, things can get out of hand in all these cases, and the MIT paper says that the "increasingly human-like behavior and engagement of chatbots" works to increase addictive qualities and behavior in users.

In addition to the addiction and dependency issues, the teams investigated suspected chatbot usage problems such as unrealistic expectations in real life and social withdrawal. If you or anyone you know shows indicators of addiction to chatbots, it might be a good idea to talk with them or consult a professional. Warning signs include "preoccupation, withdrawal symptoms, loss of control, and mood modification," says OpenAI.
[4]
ChatGPT Use Could Correlate With Higher Levels of Loneliness
Certain types of ChatGPT usage might be linked with higher levels of loneliness, according to new joint research papers from OpenAI and the Massachusetts Institute of Technology (MIT). However, the results depended on what users were using ChatGPT for. The studies found it was "personal conversations" with the chatbot -- those that included more emotional expression -- that were correlated with loneliness among users.

Users' emotional makeup going into the experiment was also found to be a key factor. These negative effects were more common among participants who had a stronger tendency toward attachment in relationships and those who "viewed the AI as a friend that could fit in their personal life." Extended daily use of these types of "personal conversations" was also associated with worse outcomes, as per the studies. However, the research found that these types of emotional conversations were a fairly niche use case for ChatGPT, and not something employed by the majority of users.

The conclusions were drawn from two separate studies: a Randomized Controlled Trial (RCT) carried out by MIT, in which 1,000 participants used ChatGPT over four weeks, and an automated analysis of nearly 40 million ChatGPT interactions conducted by OpenAI.

More and more attention is being directed toward the negative impacts of conversational chatbots on mental health, as several high-profile cases have emerged. In October 2024, a Florida mother sued Character.AI, alleging that the company's chatbot technology played a part in her 14-year-old son's death by suicide. Meanwhile, some countries have already taken regulatory action to address the potential impact of chatbots on mental health. Two years ago, the Italian government ordered San Francisco-based AI chatbot firm Replika -- which specializes in virtual friendship -- to stop processing Italians' data because of the risks it could pose to vulnerable people.

On the other side of the coin, a considerable amount of research is being devoted to assessing how chatbots can be used for therapy or to improve mental health. Though the idea of AI therapists has so far proved controversial, some studies have indicated potential benefits of using chatbots in the treatment of depression, at least in the short term.
[5]
Joint studies from OpenAI and MIT found links between loneliness and ChatGPT use
New studies from OpenAI and MIT Media Lab found that, generally, the more time users spend talking to ChatGPT, the lonelier they feel. The connection was made as part of two yet-to-be-peer-reviewed studies: one done at OpenAI analyzing "over 40 million ChatGPT interactions" alongside targeted user surveys, and another at MIT Media Lab following participants' ChatGPT use for four weeks.

MIT's study identified several ways talking to ChatGPT -- whether through text or voice -- can affect a person's emotional experience, beyond the general finding that higher use led to "heightened loneliness and reduced socialization." For example, participants who already trusted the chatbot and tended to get emotionally attached in human relationships felt lonelier and more emotionally dependent on ChatGPT during the study. Those effects were less severe with ChatGPT's voice mode, though, particularly if ChatGPT spoke in a neutral tone. Discussing personal topics also tended to lead to loneliness in the short term, and interestingly, speaking to ChatGPT about more general topics was more likely to increase emotional dependence.

The big finding from OpenAI's study was that having emotional conversations with ChatGPT is still not common. "Emotionally expressive interactions were present in a large percentage of usage for only a small group of the heavy Advanced Voice Mode users we studied," OpenAI writes. That suggests that even if MIT's findings are as concerning as they are unsurprising, they're not exactly widespread outside a small group of power users.

There are important limitations to MIT Media Lab and OpenAI's research, like both studies covering a short period of time (one month for MIT, 28 days for OpenAI) and MIT not having a control group to compare to. The studies do add more evidence to something that has seemed intuitively true for a while now: talking to AI has a psychological impact on the humans doing the talking. Given the intense interest in making AI a compelling conversation partner, whether it's in video games or as a way to simplify the job of YouTube creators, it's clear that MIT Media Lab and OpenAI are right to want to understand what will happen when talking to AI is the norm.
[6]
Lonely People Are Even Sadder After Using Chatbots, Research Finds
MIT and OpenAI teamed up to study how AI chatbots are making people feel. Where users ended up depends on where they started.

OpenAI and the MIT Media Lab last week released two new studies aimed at exploring the effect of AI chatbots on loneliness. The results are complicated, but they also line up with what we now know about social media: chatbots can make people lonely, but the people who reported feeling more alone after heavy use of an AI tended to feel pretty alone before they started.

To do the studies, OpenAI turned over almost 40 million interactions its users had with ChatGPT to researchers at MIT. In the first study, MIT looked at the "aggregate usage" of around 6,000 "heavy users of ChatGPT's Advanced Voice Mode over 3 months" and surveyed 4,076 specific users to understand how the chatbot made them feel. In the second study, the researchers looked at how 981 participants interacted with ChatGPT over the course of 28 days.

The papers are in-depth, complicated, and worth a close read. One of the big takeaways is that people who used the chatbots casually and didn't engage with them emotionally didn't report feeling lonelier at the end of the study. Yet if a user said they were lonely before they started the study, they felt worse after it was over. "Overall, higher daily usage -- across all modalities and conversation types -- correlated with higher loneliness, dependence, and problematic use, and lower socialization," the study said.

Different kinds of interaction produced different results. Lonely users using a voice-based chatbot rather than a text-based one tended to fare worse. "Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot," the study said.

The researchers were clear-eyed about the results and compared the findings to previous studies on social media addiction and problem gaming. "The relationship between loneliness and social media use often becomes cyclical: the lonelier people are, the more time they spend on these platforms where they compare themselves with others and experience the fear of missing out, leading to more loneliness and subsequent usage," the MIT team wrote in their paper. "Loneliness is both a cause and effect of problematic internet use."

The researchers stressed that the first study, which looked at a large sample and relied on a lot of self-reported data, lacked a control group. It also didn't take into account external factors like the weather and seasonal changes, two things that can have a massive impact on mood. Research into human emotional dependence on chatbots and its consequences is in its early days.

The researchers said that companies working on AI need to study the guardrails in their services that would help mitigate the risks of exacerbating loneliness. They also said that the more a person understands about how AI systems work, the less likely they are to become dependent on them. "From a broader perspective, there is a need for a more holistic approach to AI literacy," the study said. "Current AI literacy efforts predominantly focus on technical concepts, whereas they should also incorporate psychosocial dimensions."

The final sentence of the first study's "impact" section cut to the heart of the problem: "Excessive use of AI chatbots is not merely a technological issue but a societal problem, necessitating efforts to reduce loneliness and promote healthier human connections."

The loneliness epidemic is real and complex. People are lonely for a lot of different reasons. Third places like malls, bars, and coffee shops are vanishing or becoming too expensive to use. People have migrated a lot of social interaction to the internet. Living in vast suburbs and driving on a highway to get everywhere cuts people off from each other. AI didn't do any of this, but it could make it worse.

OpenAI partnered with MIT to conduct these studies, and that shows a willingness to engage with the problem. What worries me is that every business invariably pursues its bottom line. In these studies I see not just an open-hearted discussion about the dangers of a new technology but also a report that will tell people with a financial interest in attracting new users that their product can be addictive to a certain kind of person.

This is already happening. In 2023, a Belgian man committed suicide after he had a long "relationship" with a chatbot based on GPT-4. The man had a history of depression, and his wife blamed the bot. Last year, a mother launched a lawsuit against Character.AI after her son took his life while chatting with the bot. Her 93-page court filing is a harrowing look into how Character.AI draws users in and attempts to establish an emotional connection with them.

There is a market for AI companions. They can provide an ersatz connection to the lonely. But they can also induce that feeling of loneliness. The bots are also programmed by the people selling their services. They are complex machines, but they're still machines, and they reflect the will of their programmer, not the user. Many of these companies, such as Replika, Character.AI, and ChatGPT, charge a recurring fee for monthly access to their best features. If, as these studies suggest, lonely people can become addicted to using the chatbots, then there's a financial incentive to keep people lonely.

"While improving AI policy and establishing guardrails remain crucial, the broader issue lies in ensuring people have strong social support systems in real life. The increasing prevalence of loneliness suggests that focusing solely on technical solutions is insufficient, as human needs are inherently complex," the first study said in its conclusion. "Addressing the psychosocial dimensions of AI use requires a holistic approach that integrates technological safeguards with broader societal interventions aimed at fostering meaningful human connections."
[7]
Is ChatGPT making us lonely? MIT/OpenAI study reveals possible link
As artificial intelligence gets smarter and more conversational through a variety of models like ChatGPT, using them could lead to an increased feeling of loneliness. That's one of the takeaways from a pair of studies into increased chatbot usage conducted by ChatGPT's maker, OpenAI, and the MIT Media Lab. And while neither has been peer-reviewed as yet, it's interesting that both came to similar conclusions.

OpenAI's part of the study analyzed "over 40 million ChatGPT interactions," while MIT leveraged different input methods to assess how chatbots can affect a user's emotions. According to MIT's study, more usage of ChatGPT (and ChatGPT specifically) can lead to "heightened loneliness and reduced socialization," but it also found that study participants with deeper trust in ChatGPT would grow more emotionally dependent on it.

Curiously, despite the more "personal" feel of voice mode, the possibility of becoming emotionally dependent on the tool was lessened when ChatGPT spoke. This was, perhaps unsurprisingly, more pronounced when ChatGPT adopted a neutral tone, as opposed to using an accent or adopting a persona. Given the prevalence of text-based communication, it's possible users of the chatbot are more likely to develop an attachment to text than to a pseudo-human AI voice that breaks the illusion.

"Emotionally expressive interactions were present in a large percentage of usage for only a small group of the heavy Advanced Voice Mode users we studied," OpenAI's study noted, suggesting that even though it is possible to get attached to ChatGPT, it's still not all that common.

It bears repeating that neither study has been peer-reviewed yet, and both cover relatively short periods. But as OpenAI notes in the study's introduction: "This research provides a starting point for further studies that can increase transparency, and encourage responsible usage and development of AI platforms across the industry."
[8]
Heavy ChatGPT users tend to be more lonely, suggests research
Studies show those who engage emotionally with the bot rely on it more and have fewer real-life relationships.

Heavy users of ChatGPT tend to be lonelier, more emotionally dependent on the AI tool, and have fewer offline social relationships, new research suggests. Only a small number of users engage emotionally with ChatGPT, but those who do are among the heaviest users, according to a pair of studies from OpenAI and the MIT Media Lab.

The researchers wrote that the users who engaged in the most emotionally expressive personal conversations with the chatbots tended to experience higher loneliness -- though it isn't clear if this is caused by the chatbot or because lonely people are seeking emotional bonds. While the researchers have stressed that the studies are preliminary, they raise pressing questions about how AI chatbots such as ChatGPT, which according to OpenAI is used by more than 400 million people a week, are influencing people's offline lives.

The researchers, who plan to submit both studies to peer-reviewed journals, found that participants who "bonded" with ChatGPT -- typically in the top 10% for time spent with the tool -- were more likely than others to be lonely, and to rely on it more.

The researchers established a complex picture in terms of the impact. Voice-based chatbots initially appeared to help mitigate loneliness compared with text-based chatbots, but this advantage started to slip the more someone used them. After using the chatbot for four weeks, female study participants were slightly less likely to socialise with people than their male counterparts. Participants who interacted with ChatGPT's voice mode in a gender that was not their own reported significantly higher levels of loneliness and more emotional dependency on the chatbot at the end of the experiment.

In the first study, the researchers analysed real-world data from close to 40m interactions with ChatGPT, and then asked the 4,076 users who had had those interactions how they had made them feel. For the second study, the Media Lab recruited almost 1,000 people to take part in an in-depth four-week trial examining how participants interacted with ChatGPT for a minimum of five minutes each day. Participants then completed a questionnaire to measure their feelings of loneliness, levels of social engagement, and emotional dependence on the bot.

The findings echo earlier research: in 2023, MIT Media Lab researchers found that chatbots tended to mirror the emotional sentiment of a user's messages - happier messages led to happier responses.

Dr Andrew Rogoyski, a director at the Surrey Institute for People-Centred Artificial Intelligence, said that because people were hard-wired to think of a machine behaving in human-like ways as a human, AI chatbots could be "dangerous", and far more research was needed to understand their social and emotional impacts. "In my opinion, we are doing open-brain surgery on humans, poking around with our basic emotional wiring with no idea of the long-term consequences. We've seen some of the downsides of social media - this is potentially much more far-reaching," he said.

Dr Theodore Cosco, a researcher at the University of Oxford, said the research raised "valid concerns about heavy chatbot usage", though he noted it "opens the door to exciting and encouraging possibilities". "The idea that AI systems can offer meaningful support -- particularly for those who may otherwise feel isolated -- is worth exploring. However, we must be thoughtful and intentional in how we integrate these tools into everyday life."

Dr Doris Dippold, who researches intercultural communication at the University of Surrey, said it would be important to establish what caused emotional dependence on chatbots. "Are they caused by the fact that chatting to a bot ties users to a laptop or a phone and therefore removes them from authentic social interaction? Or is it the social interaction, courtesy of ChatGPT or another digital companion, which makes people crave more?"
[9]
Using ChatGPT too much can create emotional dependency, study finds
OpenAI seems to be announcing new AI models by the week to improve its ChatGPT chatbot for the betterment of its 400 million users. However, the ease of access the AI tool provides raises a question: is it possible to have too much of a good thing? The artificial intelligence company is now delving into the potential psychological ramifications that ChatGPT might have on its users.

OpenAI has published the results of a two-part study completed alongside MIT Media Lab, which uncovered a connection between increased usage of the ChatGPT chatbot and increased feelings of loneliness.

Each organization conducted an independent study and then compiled the results into a consolidated conclusion. OpenAI's study examined "over 40 million ChatGPT interactions" across one month, without human involvement, to maintain user privacy. Meanwhile, MIT observed approximately 1,000 participants using ChatGPT over 28 days. Currently, the studies have not yet been peer-reviewed.

MIT's study delved into the different ways of using ChatGPT that could affect users' emotional experience, including interacting by text or by voice. Results found that either medium had the potential to elicit loneliness or to affect users' socialization during the time of the study. Voice inflection and topic choice also mattered: a neutral tone used in ChatGPT's voice mode was less likely to lead to a negative emotional outcome for participants. Meanwhile, the study observed a correlation between participants having personal conversations with ChatGPT and an increased likelihood of loneliness; however, these effects were short-term. Those using text chat, even to converse about general topics, experienced increased instances of emotional dependence on the chatbot.

The study also observed that those who reported viewing ChatGPT as a friend, and those who already had a propensity toward strong emotional attachment in relationships, were more likely to feel lonelier and more emotionally dependent on the chatbot while participating in the study.

OpenAI's study added additional context, with its results noting overall that interacting with ChatGPT for emotional purposes was rare. Additionally, the study found that even among heavy users of the chatbot's Advanced Voice Mode feature, who were more likely to say they considered ChatGPT to be a friend, this group of participants experienced low emotional reactions to interacting with the chatbot.

OpenAI concluded that its intent with these studies is to understand the challenges that might arise due to its technology, as well as to be able to set expectations and examples for how its models should be used. While OpenAI suggests that its interaction-based study simulates the behaviors of real people, more than a few real humans have admitted on public forums, such as Reddit, to using ChatGPT in place of going to a therapist with their emotions.
[10]
Something Bizarre Is Happening to People Who Use ChatGPT a Lot
Researchers have found that ChatGPT "power users," or those who use it the most and for the longest durations, are becoming dependent upon -- or even addicted to -- the chatbot.

In a new joint study, researchers with OpenAI and the MIT Media Lab found that this small subset of ChatGPT users engaged in more "problematic use," defined in the paper as "indicators of addiction... including preoccupation, withdrawal symptoms, loss of control, and mood modification."

To get there, the MIT and OpenAI team surveyed thousands of ChatGPT users to glean not only how they felt about the chatbot, but also to study what kinds of "affective cues" -- defined in a joint summary of the research as "aspects of interactions that indicate empathy, affection, or support" -- they used when chatting with it.

Though the vast majority of people surveyed didn't engage emotionally with ChatGPT, those who used the chatbot for longer periods of time seemed to start considering it to be a "friend." The survey participants who chatted with ChatGPT the longest tended to be lonelier and get more stressed out over subtle changes in the model's behavior, too.

Add it all up, and it's not good. In this study, as in other cases we've seen, people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationships with AI -- and where that leads could end up being sad, scary, or somewhere entirely unpredictable.

This new research also highlighted unexpected contradictions based on how ChatGPT was used. For instance, people tended to use more emotional language with text-based ChatGPT than with Advanced Voice Mode, and "voice modes were associated with better well-being when used briefly," the summary explained. And those who used ChatGPT for "personal" reasons -- like discussing emotions and memories -- were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.

Perhaps the biggest takeaway, however, was that prolonged usage seemed to exacerbate problematic use across the board. Whether you're using ChatGPT by text or voice, asking it personal questions, or just brainstorming for work, it seems that the longer you use the chatbot, the more likely you are to become emotionally dependent upon it.
[11]
ChatGPT Use Linked to Increased Loneliness, Finds OpenAI Study
The study reported loneliness and higher dependence on chatbots, particularly among users who trusted and bonded with the AI.

A recent joint study by OpenAI and MIT Media Lab suggested that frequent users of ChatGPT who emotionally bond with the AI chatbot are more likely to experience loneliness and social isolation. ChatGPT, which has amassed over 400 million weekly users since its launch, is not designed as an AI companion but is often used in that capacity by some users.

The study, which combined large-scale data analysis and controlled trials, found that individuals who engaged deeply with the chatbot reported higher loneliness and dependence on it over time. These effects were particularly notable among users who trusted and bonded with the AI.

The researchers used a two-pronged approach: analysing millions of real-world interactions and surveying over 4,000 users on their chatbot engagement. Moreover, a four-week trial involving 1,000 participants examined the psychological impact of daily ChatGPT use. The results showed that increased daily usage correlated with heightened loneliness and reduced socialisation. "Higher daily usage across all modalities and conversation types correlated with higher loneliness, dependence and problematic use and lower socialisation," the study noted.

Further analysis focused on ChatGPT's Advanced Voice Mode, which offers neutral and engaging interaction styles. While voice-based interactions initially appeared beneficial in mitigating loneliness compared to text-based conversations, the advantages diminished at high usage levels, especially with the neutral-tone chatbot.

As per reports, Jason Phang, an OpenAI safety researcher involved in the project, said, "A lot of what we're doing here is preliminary, but we're trying to start the conversation about measuring these impacts and understanding what the long-term effects on users might be."

While concerns about AI-induced loneliness persist, Microsoft's AI companion initiative aims to integrate digital assistants into everyday productivity tools, helping users manage tasks, set goals and enhance well-being.
[12]
OpenAI study finds links between ChatGPT use and loneliness
Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology. Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released earlier this month. The findings were part of a pair of studies conducted by researchers at the two organizations and have not been peer-reviewed.

The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots. In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.

San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. "Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design," said Sandhini Agarwal, who heads OpenAI's trustworthy AI team and co-authored the research.

To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or non-personal conversations with the service. The researchers found that people who tend to get more emotionally attached in human relationships and are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. The researchers didn't find that a more engaging voice led to a more negative outcome, they said.

In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found very few people actually use ChatGPT for emotional conversations.

It's still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people prone to a sense of loneliness and emotional dependence may have those feelings exacerbated by chatbots. Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn't control for the amount of time people used the chatbot as a main factor, she said, and didn't compare against a control group that doesn't use chatbots.

The researchers hope the work leads to more studies on how humans interact with AI. "Focusing on the AI itself is interesting," said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. "But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people."
[13]
OpenAI Says Using ChatGPT Can Make You Lonelier. Should You Limit AI Use at Work?
While the results were presented with some nuance and subtlety, they fuel a narrative begun in 2023, when then-Surgeon General Dr. Vivek Murthy warned that there was an "epidemic" of "loneliness and isolation" harming Americans' health, and partly blamed our digital era, noting to NPR that "we are living with technology that has profoundly changed how we interact with each other."

That doesn't mean AI use should be banned at work, but it's worth considering how long your employees are spending working with a chatbot. The authors of the joint study noted that while "most participants spent a relatively short amount of time chatting with the chatbot, a smaller number of participants engaged for significantly longer periods." It's these "power users" who may be experiencing the biggest impact. The authors noted that "higher daily usage -- across all modalities and conversation types -- correlated with higher loneliness, dependence, and problematic use, and lower socialization."

Reporting on the study, Business Insider pointed out that this sort of investigation is always tricky in some ways, because feelings of loneliness and social isolation often change from moment to moment, influenced by many factors. To control for this, the researchers measured both survey participants' feelings of loneliness and their actual level of socialization, to separate genuine social isolation from subjective feelings of loneliness.
[14]
OpenAI finds own product ChatGPT linked to more loneliness, social isolation
The studies, conducted using GPT-4o, OpenAI's multimodal model released in May 2024, raise questions about the emotional consequences of human-AI interaction as conversational AI tools become more embedded in daily life.

A joint study by Microsoft-backed OpenAI and the Massachusetts Institute of Technology (MIT) Media Lab has found that frequent users of ChatGPT may be more prone to loneliness and social isolation. Researchers analysed millions of text-based conversations and thousands of audio interactions, identifying a small group of "power users" who showed a strong correlation between intensive ChatGPT usage and increased feelings of loneliness, dependence, and reduced socialisation. The findings, published this week, raise questions about the emotional consequences of human-AI interaction as conversational AI tools become more embedded in daily life.

"Our analysis reveals that a small number of users are responsible for a disproportionate share of the most affective cues," the researchers noted, referring to indicators of emotional state extracted from user interactions.

The study, which surveyed 4,000 users and tracked nearly 1,000 participants over four weeks, also highlighted a paradox linked to ChatGPT's Advanced Voice Mode. While voice-based interactions initially helped reduce feelings of loneliness, users who reported high levels of loneliness at the outset were more likely to overuse the tool, potentially worsening their condition over time.

Researchers tested two AI response styles: a "neutral mode" designed to be formal and efficient, and an "engaging mode" aimed at fostering emotional connection. Power users reported greater loneliness when interacting with the neutral configuration compared to the engaging one.

With ChatGPT now attracting over 400 million weekly active users globally, researchers say the findings could have broader implications. The studies were conducted using GPT-4o, OpenAI's multimodal model released in May 2024, rather than the latest GPT-4.5 model rolled out last month.
New research from OpenAI and MIT suggests that heavy use of AI chatbots like ChatGPT may correlate with increased feelings of loneliness and emotional dependence, raising concerns about the impact of AI on human well-being.
In a significant development for AI research, OpenAI and the Massachusetts Institute of Technology (MIT) have released their first joint studies examining the emotional impact of ChatGPT usage on individuals. The research, which has not yet been peer-reviewed, provides valuable insights into the complex relationship between humans and AI chatbots [1].
The research comprised two complementary studies: an automated, large-scale analysis by OpenAI of nearly 40 million ChatGPT interactions, paired with surveys of 4,076 users about how those interactions made them feel, and a four-week randomized controlled trial by the MIT Media Lab, in which almost 1,000 participants used ChatGPT for at least five minutes a day.
These studies aimed to evaluate how AI chat platforms shape users' emotional well-being and behaviors, focusing on factors such as loneliness, social interactions, and emotional dependence on AI chatbots.
The research revealed several intriguing patterns. Heavy users tended to be more trusting of the chatbot and more likely to report loneliness and emotional dependence, though emotionally expressive conversations were concentrated among a small group of heavy Advanced Voice Mode users. Personal conversations were associated with higher loneliness but lower emotional dependence at moderate usage levels, while non-personal conversations tended to increase dependence with heavy use. Voice-based interactions initially appeared to mitigate loneliness, but that advantage diminished at high usage levels.
The studies also uncovered interesting differences based on gender and personalization: after four weeks, female participants were slightly less likely to socialize with people than male participants, and participants who set ChatGPT's voice mode to a gender other than their own reported significantly higher loneliness and emotional dependence on the chatbot.
The research raises important questions about the potential psychological impacts of AI chatbot use: whether heavy use causes loneliness or lonely people gravitate toward heavy use, and what guardrails AI platforms should build to encourage safer and healthier interactions.
While these studies provide valuable initial insights, researchers emphasize the need for caution in generalizing the results. Jason Phang, an OpenAI policy researcher, states, "A lot of what we're doing here is preliminary, but we're trying to start the conversation with the field about the kinds of things that we can start to measure, and to start thinking about what the long-term impact on users is" [1].
As AI chatbots become increasingly prevalent, understanding their psychological impact will be crucial for developing safer and healthier human-AI interactions [5].