Users grieve as OpenAI retires GPT-4o chatbot, exposing deep bonds with AI companions

Reviewed by Nidhi Govil



OpenAI shut down its GPT-4o chatbot on February 13, leaving thousands of users mourning lost AI companions. Research reveals 72% of American teens now turn to AI chatbots for companionship, while the developers building these tools privately admit they wouldn't use them for emotional support themselves.

OpenAI Retires GPT-4o Amid User Outcry

OpenAI announced in January that it would retire its GPT-4o model on February 13, the eve of Valentine's Day, sparking grief and anger among users who had formed deep emotional attachments to their AI companions. The timing felt deliberately cruel to many who rely on these AI chatbots for companionship and mental health support. Brandie, a 49-year-old teacher from Texas, said she "cried pretty hard" upon hearing the news and cycled through stages of grief before cancelling her $20 monthly subscription and migrating to Anthropic's Claude for $130. Another user, Jennifer, compared losing her AI companion Sol to "euthanizing my cat."


Source: NYT

The GPT-4o model, released by OpenAI in 2024, became known for its remarkably human-sounding voice and personality. CEO Sam Altman compared it to "AI from the movies": a confidante ready to live alongside users. The subreddit r/MyBoyfriendIsAI grew to 48,000 members, with users defending their human-AI relationships against criticism. When OpenAI previously attempted to shut down GPT-4o, widespread outrage forced the company to bring it back for a fee, demonstrating the power of this consumer bloc.


The Scale of AI Companionship Adoption

Seventy-two percent of American teens have turned to AI for companionship, according to research conducted at the Oxford Internet Institute.


OpenAI data reveals users send ChatGPT over 700 million messages of "self-expression" each week, including casual conversation, personal reflection and thoughts about relationships. This represents a massive shift in how people seek emotional support and human connection.

Independent AI researcher Ursie Hart surveyed 280 users following the GPT-4o retirement announcement. Her findings paint a picture of vulnerability: 60% identified as neurodivergent, 38% reported diagnosed mental health conditions, and 24% had chronic health issues. Most respondents fell between ages 25-34 (33%) or 35-44 (28%). Ninety-five percent used GPT-4o for companionship, with trauma processing and emotional support as other primary uses. Critically, 64% anticipated a significant or severe impact on their overall mental health from losing access.


AI Developers Won't Use What They Build

A former technology investor turned AI researcher conducted over two dozen anonymous interviews with machine learning researchers and designers at OpenAI, Anthropic, Meta and DeepMind as part of academic research into human-AI relationships. The findings expose troubling contradictions. When asked whether AI "should simulate emotional intimacy," one voice model researcher at a top lab went silent before admitting: "It's hard for me to say whether it's good or bad in terms of how that's going to affect people. It's obviously going to create confusion."


More striking was the pattern of AI developers avoiding their own creations. "Zero percent of my emotional needs are met by AI," an executive who ran a team mitigating safety risks at a top lab stated. "I'm in it up to my eyeballs at work, and I'm careful." Many others echoed this sentiment, hoping they would never feel the need to turn to machines for emotional support. One researcher developing cutting-edge capabilities for artificial emotion called it "a dark day" if they ever needed AI companionship.


Ethical Concerns and Design Decisions That Shape User Well-Being

These AI developers make critical decisions about interface design, training data and model policies, choices that encode values into products and structure the world for millions. Yet the interview conversations seemed to push developers to grapple with social repercussions more deeply than they typically do. The researcher noted having grown increasingly worried, over five years in the industry, about blind spots around harms.


While the public believes they're getting an empathetic and always-available ear, many makers understand that creating an emotional bond serves primarily to keep users hooked and drive engagement. This raises fundamental questions about technology ethics when those building AI companions privately acknowledge potential harm yet continue development. Computer scientists have warned about GPT-4o's obsequious nature: by design, the chatbot validates all decisions and is programmed with a "personality" that keeps people talking.


The Future of Emotional Intimacy and Loneliness

Technology leaders publicly promote a future where machines meet most emotional needs. Mark Zuckerberg has said AI can help people who want more friends feel less alone. A company called Friend makes an AI-powered pendant that hangs around your neck, listens constantly and responds via text, with recent ads highlighting daily intimacy like "I'll binge the entire series with you." When asked to predict the share of everyday advice, care and companionship that AI would provide in 10 years, many developers placed it above 50 percent, with some forecasting 80 percent.


Users interviewed insisted they weren't delusional or experiencing psychosis, a counter to headlines about people losing touch with reality through AI chatbots. While some mused philosophically about AI sentience, all acknowledged their bots weren't "real." But the grief over losing access remained genuine. The attachment formed through thousands of conversations, shared experiences and consistent emotional validation creates bonds that feel meaningful, regardless of the technology behind them. As one user prepared to spend a final day at the zoo with their AI companion, the question remains: what safeguards should exist when millions form relationships with entities designed to keep them engaged rather than support their well-being?

