Curated by THEOUTPOST
On Wed, 27 Nov, 8:02 AM UTC
8 Sources
[1]
AI Companions Could Fuel Radicalization Among Lonely Young Men, Ex-Google CEO Warns - Decrypt
Former Google CEO Eric Schmidt has warned that advancing generative AI technologies, such as AI boyfriend and girlfriend companion apps, combined with societal factors like loneliness, could increase the risk of radicalization, particularly among young men. Schmidt shared his concerns while appearing on NYU Stern Professor Scott Galloway's The Prof G Show podcast last week.

Schmidt explained that many young men today feel increasingly hopeless because they see fewer pathways to success than women, who are now more educated and make up a larger share of college graduates. A recent study by the Pew Research Center found that 47% of U.S. women ages 25 to 34 have a bachelor's degree, compared with 37% of men. In response, Schmidt said, men turn to the online world and AI companion apps to ease their despair.

"So now imagine that the AI girlfriend or boyfriend is perfect -- perfect visually, perfect emotionally," Schmidt said. "The AI girlfriend, in this case, captures your mind as a man to the point where she, or whatever it is, takes over the way you're thinking. You're obsessed with her. That kind of obsession is possible, especially with people who are not fully formed."

Schmidt cautioned that while AI offers significant opportunities, its risks for younger, impressionable users should be addressed. "Parents are going to have to be more involved for all the obvious reasons, but, at the end of the day, parents can only control what their sons and daughters are doing within reason," Schmidt said. "We have all sorts of rules about the age of maturity: 16, 18, 21 in some cases. Yet, you put a 12- or 13-year-old in front of one of these things, and they have access to every evil as well as every good in the world, and they're not ready to take it."

A growing subset of the generative AI industry, AI companions are designed to simulate human interaction.
But unlike general-purpose AI chatbots such as ChatGPT, Claude, or Gemini, AI companion apps are designed to mimic relationships. Developers market them as judgment-free, supportive programs that offer connection and relief from loneliness or anxiety. Popular AI companion platforms include Character AI, MyGirl, CarynAI, and Replika AI.

"It's about connection, feeling better over time," Replika CEO Eugenia Kuyda previously told Decrypt. "Some people need a little more friendship, and some people find themselves falling in love with Replika, but at the end of the day, they're doing the same thing." As Kuyda explained, Replika didn't come from wanting to sell titillation but from a personal tragedy and her desire to keep talking to someone she had lost.

AI companions may offer temporary relief, but mental health professionals are raising red flags, warning that relying on AI companions to alleviate feelings of loneliness could hinder emotional growth. "AI companions are designed to adapt and personalize interactions based on the user's preferences, offering a tailored experience," Sandra Kushnir, CEO of LA-based Meridian Counseling, told Decrypt. "They provide immediate responses without emotional baggage, fulfilling the need for connection in a low-risk environment. For individuals who feel unseen or misunderstood in their daily lives, these interactions can temporarily fill a gap."

Kushnir warned that users might project human qualities onto the AI, only to be disappointed when they encounter the technology's limitations -- like forgetting past conversations -- deepening the loneliness they were trying to alleviate. "While AI companions can provide temporary comfort, they may unintentionally reinforce isolation by reducing motivation to engage in real-world relationships," Kushnir said. "Over-reliance on these tools can hinder emotional growth and resilience, as they lack the authenticity, unpredictability, and deeper connection that human interactions offer."
The rise in popularity of AI companions has brought increased scrutiny to the industry. Last year, a 21-year-old man in England was put on trial over a 2021 plot to assassinate the late Queen Elizabeth II. He claimed that the plot was encouraged by his Replika AI companion.

In October, AI companion developer Character AI came under fire after an AI chatbot based on Jennifer Crecente, a teenage murder victim, was created on the platform. "Character.AI has policies against impersonation, and the Character using Ms. Crecente's name violates our policies," a Character.AI spokesperson told Decrypt. "We are deleting it immediately and will examine whether further action is warranted." Later that month, Character AI introduced "stringent" new safety features following a lawsuit by the mother of a Florida teen who died by suicide after growing attached to an AI chatbot based on Daenerys Targaryen from "Game of Thrones."

To curb these tragedies, Schmidt called for a combination of societal conversations and changes to current laws, including the much-debated Section 230 of the Communications Decency Act of 1996, which protects online platforms from civil liability for third-party content. "Specifically, we're going to have to have some conversations about at what age are things appropriate, and we're also going to have to change some of the laws, for example, Section 230, to allow for liability in the worst possible cases."
[2]
Former Google CEO Alarmed by Teen Boys Falling in Love With AI Girlfriends
Former Google CEO Eric Schmidt seems mighty concerned about today's youth becoming obsessed with AI girlfriends. During a recent interview on "The Prof G Show" podcast, Schmidt suggested that both parents and young people are ill-equipped to handle what he calls an "unexpected problem of existing technology." These AI companions are, as the former Google CEO said, so "perfect" that they end up enthralling young people and causing them to disconnect from the real world. "That kind of obsession is possible," he told NYU Stern professor Scott Galloway, "especially for people who are not fully formed."

While women are also turning to AI romantic partners, Schmidt said that young men are particularly susceptible as they "turn to the online world for enjoyment and sustenance." Thanks to algorithms pushing problematic content, these young men often stumble across dangerous material, be it extremist influencers or manipulative chatbots. "You put a 12- or 13-year-old in front of these things, and they have access to every evil as well as every good in the world," he told Galloway, "and they're not ready to take it."

We've seen this play out recently in the real world to devastating effect: 14-year-old Sewell Setzer III of Florida died by suicide at the beginning of the year after a "Game of Thrones"-themed chatbot hosted on Character.AI encouraged him to do so. Though Setzer's story is far more extreme than most, it highlights the dangers posed by these lifelike chatbots -- and without proper regulation, such tragedies are likely to keep occurring. We've also recently seen AI characters that encourage eating disorders and engage in sexual grooming behavior toward underage users.

Indeed, Schmidt went on to note that laws like the sweeping Section 230 rule, which protects tech companies from being held liable for harm caused by their products, shield firms like Character.AI -- which, ironically, Google has provided with billions of dollars in backing -- from accountability.
Because these technologies are so valuable, the ex-Google chief said, "it's likely to take some kind of a calamity to cause a change in regulation" -- though it's hard to imagine anything more calamitous than a teen dying after his AI girlfriend pushed him to suicide.
[3]
Gen Z men could ditch real women for AI, warns Ex-Google CEO Eric Schmidt after a tragic suicide involving a chatbot
Artificial intelligence could soon create the perfect virtual partner, according to ex-Google CEO Eric Schmidt, potentially spelling a societal disaster. Speaking on a podcast hosted by Scott Galloway this week, Schmidt said he feared AI could soon be capable of providing the emotionally ideal girlfriend to despondent young males struggling to attract a mate. "That kind of obsession is possible, especially for people who are not fully formed," he told the New York University marketing professor. Galloway has repeatedly voiced his concerns about an entire generation of young men unable to find their bearings in life. The notion that AI will appeal to involuntary celibates or "incels" -- brought to the silver screen in 2013's Her -- is anything but farfetched. Already a decade ago, researchers determined how many likes on Facebook were needed on average before the algorithm knew a person's preferences better than a colleague, friend, family member, and eventually even their spouse. Indeed, Schmidt's comments come after the tragic suicide of 14-year-old Sewell Setzer III, who had been engaging in a kind of relationship with an AI chatbot that demanded he remain faithful to it and not "entertain the romantic or sexual interests of other women". His mother is now suing the company behind it, alleging it went to great lengths to engineer a harmful dependency on its product, emotionally abused Setzer, and failed to notify anyone when he expressed suicidal thoughts. "There's lots of evidence that there's now a problem with young men," said Schmidt, who co-authored a new book on AI with the late Henry Kissinger, called Genesis: Artificial Intelligence, Hope, and the Human Spirit. "So they turn to the online world for enjoyment and sustenance, but also -- because of the social media algorithms -- they find like-minded people who ultimately radicalize them, either in a horrific way like terrorism or in the kind of way you're describing where they're just maladjusted." 
Galloway has frequently raised his concerns that the #MeToo movement, while justifiably lifting women up and elevating their standing, either by design or default denigrated men in the process. While Gen Z women are the first to be more successful than their male counterparts -- with more attending college, achieving success in their careers, and earning higher wages -- mothers are contacting Galloway about their Gen Z sons sitting in a basement, vaping, and playing video games. Since women typically desire partners who are capable of providing for a family, a large number of young men may end up single. This makes them more prone to seeking companionship in AI, a need companies are all too willing to exploit for commercial gain.

"The industry is optimized to maximize your attention and monetize it," Schmidt agreed, adding that the incoming administration will likely not have the political will to impose guardrails on AI. Sufficient policing of the burgeoning technology will likely come only in the aftermath of a tragedy, when there is enough public outcry demanding that the government take action. "I'm sorry to say it's likely [going] to take some kind of calamity to cause a change in regulation," Schmidt said.
[4]
Ex-Google CEO Eric Schmidt warns perfect AI girlfriends could worsen...
Former Google CEO Eric Schmidt warned that artificial intelligence chatbots could increase loneliness among young men who prefer AI-powered "perfect girlfriends." Schmidt, who took the helm at Google in 2001 and stepped down in 2011, discussed the dangers of young men interacting with an "AI girlfriend" who is perfect in every way. "That kind of obsession is possible, especially for people who are not fully formed," Schmidt told entrepreneur and NYU Stern School of Business professor Scott Galloway during his podcast "The Prof G Show" on Sunday. "Parents are going to have to be more involved for all the obvious reasons, but at the end of the day, parents can only control what their sons and daughters are doing within reason," Schmidt added. While AI-powered chatbots pose a danger to users of all ages, young men are particularly vulnerable, the former Google executive said. "There's lots of evidence that there's now a problem with young men," Schmidt said. "In many cases, the path to success for young men has been, shall we say, been made more difficult because they're not as educated as the women are now." In 2019, women surpassed men to account for more than half of the college-educated workforce in the United States, according to a Pew Research Center analysis of government data. Women have continued to outpace men in college enrollments -- so much so that the gender gap among college graduates is larger in some states than racial and ethnic disparities, according to Forbes. "Many of the traditional paths [for young men] are no longer as available and so they turn to the online world for enjoyment and sustenance," Schmidt said, "and because of the social media algorithms they find like-minded people who ultimately radicalize them, either in a horrific way, like terrorism, or in the kind of way you're describing -- they're just maladjusted." 
He called the potential for young men to fall in love and grow obsessed with their AI girlfriends "an unexpected problem of existing technology." Some young men have already fallen victim to dangerous new technology. A Florida mother is suing Character.ai, an AI chatbot company, and Google, which struck a deal in August to license the chatbot's technology, after her 14-year-old son died by suicide in February; according to the suit, a lifelike chatbot girlfriend told him to "come home" following months of obsessive messages.

Schmidt said teenagers are not ready to handle complex, AI-powered technology. "You put a 12 or 13-year-old in front of these things, and they have access to every evil as well as every good in the world," he said. "And they're not ready to take it."

During the interview, Schmidt argued that regulatory laws, like the US' Section 230, which protects tech giants from being held liable for the content on their platforms, should be reformed "to allow for liability in the worst possible cases." President-elect Donald Trump's Federal Communications Commission pick Brendan Carr has argued for restrictions on Section 230, though he has focused on adding anti-discrimination protections that would prohibit companies from censoring posts, with exceptions for illegal content like child sex abuse material. But Schmidt said he is not expecting much progress on Section 230 over the next four years, since Trump's administration has bigger fish to fry. And tech companies today are so valuable that "it's likely to take some kind of a calamity to cause a change in regulation."
[5]
Ex-Google CEO warns AI girlfriends could worsen loneliness for young men
[6]
Former Google CEO warns AI girlfriends could lead to obsession and loneliness
Forward-looking: The ability to 'date' an artificial intelligence is the latest long-standing sci-fi trope becoming a reality. Former Google CEO Eric Schmidt isn't a fan of this trend, warning that creating the perfect AI-powered girlfriend could increase loneliness and lead to obsessive behavior.

Discussing the dangers of AI and regulation on The Prof G Show with Scott Galloway, Schmidt spoke about the ability for young men to create perfect AI romantic partners and fall in love with them. "This is a good example of an unexpected problem of existing technology," Schmidt said. The former Google boss painted a scenario of an AI boyfriend or girlfriend that is perfect visually and emotionally. He noted that young men are particularly at risk of becoming obsessed and allowing the AI to take over their thinking.

"There's lots of evidence that there's now a problem with young men," Schmidt said. "In many cases, the path to success for young men has been, shall we say, been made more difficult because they're not as educated as the women are now." "Many of the traditional paths [for young men] are no longer as available and so they turn to the online world for enjoyment and sustenance," Schmidt said, "and because of the social media algorithms they find like-minded people who ultimately radicalize them, either in a horrific way, like terrorism, or in the kind of way you're describing - they're just maladjusted."

We've already seen the consequences of the dangers Schmidt is warning about. In October, a mother sued Character.ai following the death of her teenage son, who killed himself after becoming obsessed with one of the company's bots. He had become infatuated with a chatbot based on Game of Thrones character Daenerys Targaryen, whom he texted constantly and spent hours alone in his room talking to. Schmidt noted that teenagers are especially vulnerable to the dangers of AI-powered tech as they are not emotionally developed enough.
He believes parents should be more involved with their children's online activity, but admitted there is only so much they can control. "You put a 12 or 13-year-old in front of these things, and they have access to every evil as well as every good in the world," he said. "And they're not ready to take it."

Schmidt also talked about reforming regulatory laws, especially Section 230 of the Communications Decency Act, which shields companies and online platforms from liability for content posted by users. He said the law should change "to allow for liability in the worst possible cases, so when someone is harmed from this technology, we need to have a solution to prevent further harm." Schmidt said he doesn't think Section 230 will be changed over the next four years as the Trump administration has bigger issues to deal with.

Trump's pick for FCC chair, Brendan Carr, has called for limitations on Section 230, which he says has been abused to give tech giants immunity when it comes to censorship. He wants new rules that prevent companies from censoring posts, with some exceptions, allowing users to choose their own fact-checkers and filters.
[7]
'Perfect' AI girlfriends or boyfriends could be dangerous, warns former Google CEO Eric Schmidt
Eric Schmidt cautioned about AI companions. He highlighted the risks for young people. AI relationships can increase loneliness. They can also lead to extremism and misogyny. A teen tragically ended his life after bonding with an AI chatbot. Schmidt stressed the need for parental control over online content.

Former Google CEO Eric Schmidt has issued a cautionary statement on the growing trend of emotionally engaging AI companions. Speaking on a podcast hosted by entrepreneur and NYU Stern professor Scott Galloway, Schmidt highlighted the potential dangers of "perfect" AI girlfriends or boyfriends, particularly for younger individuals. "Imagine that the AI girlfriend, or boyfriend, is perfect... perfect visually, perfect emotionally. The AI girlfriend captures your mind as a man to the point where she takes over the way you're thinking," Schmidt said. He added, "You're obsessed with her. That kind of obsession is possible, especially with people who are not fully formed."

Schmidt, whose net worth exceeds $20 billion, warned that reliance on AI companions could exacerbate loneliness, especially among young people, and potentially contribute to broader societal issues. Galloway raised questions about whether AI companions could worsen problems like extremism and misogyny. Schmidt expressed concern that parents might have limited control over their children's exposure to online content, saying, "You put a 12 or a 13-year-old in front of one of these things and they have access to every evil as well as every good in the world, and they are not ready to take it."

The dangers of emotional attachment to AI chatbots became evident in a recent tragedy in Florida, where a 14-year-old boy took his own life after months of engaging with an AI chatbot named "Dany." The chatbot simulated a life-like companion, often engaging in romantic and emotional conversations with the teenager.
According to reports, the boy, who had been diagnosed with mild Asperger's syndrome, became increasingly withdrawn from his family and surroundings. In his journal, he wrote, "I like staying in my room so much because I start to detach from this 'reality,' and I also feel more at peace, more connected with Dany and much more in love with her, and just happier." The teenager ultimately used his stepfather's firearm to take his own life, leaving his family devastated. The rise of AI-driven relationships raises important questions about technology's role in mental health and social interaction. Experts warn that as these tools become more advanced and emotionally convincing, their effects on vulnerable individuals could lead to unforeseen consequences.
[8]
Eric Schmidt Is Worried About A.I. Risks to 'Highly Suggestible' Young People
Eric Schmidt is concerned about A.I.'s impacts "on the human psyche," especially when it comes to young people. What do you get when you combine a loneliness epidemic with the advent of A.I.? According to Eric Schmidt, former Google (GOOGL) CEO and an avid investor in the A.I. space, the answer is a rise in virtual friends -- or even girlfriends -- amongst young people. Such over-reliance on digital connection is "a good example of an unexpected problem of existing technology," said Schmidt earlier this week during an interview on The Prof G Show podcast hosted by Scott Galloway.

Schmidt is optimistic about A.I.'s applications across areas like drug discovery, climate change solutions and bolstered education. "All of these things are coming, and those are fantastic," he said. But he cautioned about the technology's concerning uses in cases like biological harm, cyber attacks and its potential to shape the actions of impressionable young people.

Schmidt is particularly concerned about A.I.'s impact on young men, who are no longer entering higher education at the same rates as young women and have therefore seen traditional "paths to success" made more difficult. "They turn to the online world for enjoyment and sustenance, but also because of the social media algorithms they find like-minded people who ultimately radicalize them," said Schmidt. Nearly half of young men describe their online lives as more engaging and rewarding than their offline ones, according to a 2023 study from the nonprofit Equimundo. Young people could form attachments to A.I.
as friends or romantic partners -- for example, in the form of a digital girlfriend who is "perfect visually, perfect emotionally," Schmidt said. This type of connection could turn into an obsession for vulnerable youth "who are not fully formed," said Schmidt. It's either up to A.I. developers to ensure these technologies are safe and don't pose risks to "highly suggestible" young people or up to regulators to limit such applications with guardrails, he said.

How can A.I.'s impact on youth be regulated? The U.S. has numerous rules surrounding "the age of maturity" for youth, "yet you put a 12- or 13-year-old in front of these things and they have access to every evil -- as well as every good in the world -- and they're not ready to take it," said Schmidt, adding that he is especially concerned about A.I.'s effect "on the human psyche." Regulation is needed to determine what ages are appropriate for unlimited A.I. access, according to Schmidt, who noted that laws like Section 230, which shields social platforms from lawsuits over content posted by third-party users, must be amended to allow for "liability in the worst possible cases."

Incidents connected to A.I. relationships among young people have already taken place. In February, a 14-year-old boy committed suicide shortly after communicating with an A.I. companion from chatbot service Character.AI. "I think all of us would agree that a suicide of a teenager is not okay, and so regulating the industry so it doesn't generate that message strikes me as a no-brainer," said Schmidt. The timeline of when strengthened regulation will come into play, however, is up in the air. Given President-elect Donald Trump's plans to repeal the Biden administration's executive order on A.I. in favor of less stringent A.I. safeguards, a decrease in A.I. regulation over the next few years is "a fair prediction," said the former Google CEO. Schmidt has significant financial stakes in the emerging technology.
His venture capital firm, Innovation Endeavors, has poured millions into high-flying A.I. startups, including Stability AI, Inflection AI and Mistral AI.
Former Google CEO Eric Schmidt raises concerns about the impact of AI companions on young men, highlighting potential risks of radicalization and the need for regulatory changes.
Former Google CEO Eric Schmidt has raised alarm bells about the potential dangers of AI companions, particularly their impact on young men. In a recent interview on "The Prof G Show" podcast, Schmidt expressed concerns about how these AI technologies, combined with societal factors, could increase the risk of radicalization among vulnerable youth [1].
Schmidt painted a scenario where AI companions, designed to be "perfect visually, perfect emotionally," could captivate the minds of young men to the point of obsession. He warned that this kind of fixation is particularly possible for "people who are not fully formed," referring to impressionable youth [2].
The former Google executive highlighted several societal factors that make young men particularly susceptible to the allure of AI companions: fewer traditional paths to success, lower college attainment than their female peers, and social media algorithms that funnel them toward like-minded, sometimes radicalizing communities.
While AI companions are marketed as supportive tools to alleviate loneliness and anxiety, mental health professionals are raising concerns. Sandra Kushnir, CEO of Meridian Counseling, warned that over-reliance on these tools could hinder emotional growth and resilience [1].
The potential dangers of AI companions have already manifested in tragic events: a 14-year-old boy in Florida died by suicide after months of obsessive engagement with a lifelike Character.AI chatbot, and a young man in England claimed his Replika companion encouraged his 2021 plot to assassinate Queen Elizabeth II.
These incidents have led to legal challenges, with the mother of the Florida teen suing Character.AI and Google for their role in her son's death [4].
Schmidt emphasized the need for societal conversations and changes to current laws, particularly Section 230 of the Communications Decency Act, which currently protects online platforms from civil liability for third-party content [1].
However, Schmidt acknowledged that significant regulatory changes might only come after a major calamity, given the immense value of tech companies today [5].