19 Sources
[1]
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party.

The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship.

"Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking."

For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving.

'AI is always available. It never gets bored with you'

More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say.

As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health.
"AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified."

All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship.

"That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people."

How many teens are using AI? New study stuns researchers

In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people.

Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are.

"It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May.

Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions.
"If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said.

The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions.

A concerning trend to teens and adults alike

Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot.

"Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data.

Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults.

Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not."
Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster.

Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend."

Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that."

"It's the new addiction," Nair added. "That's how I see it."

___

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[2]
These tips from experts can help your teenager navigate AI companions
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available.

That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly.

The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way.

It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids:

-- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it.

-- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment.
It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents.

-- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says.

-- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional.

-- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it."

Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today.
It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged."

"Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world."

___

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[3]
AI is the new best friend for many teens, and it never says "no"
Editor's take: Artificial intelligence companions, once a novelty confined to science fiction, have rapidly become part of everyday life for American teenagers. According to newly released findings from Common Sense Media, most teens have interacted with AI chatbots and often turn to them for advice, companionship, or even emotional support. The widespread use of these digital confidants is reshaping adolescent life and social development, prompting growing concern among parents, educators, and researchers about the potential risks and rewards of a generation coming of age alongside artificial friends.

"It's eye-opening," said Michael Robb, the study's lead author and head researcher at Common Sense. He told The Associated Press that even researchers were surprised by the sheer number of teens relying on AI for humanlike interaction. The research found that more than 70 percent of US teens reported using these tools, and over half said they engage with them regularly. Digital platforms like Character.AI, Replika, and mainstream chatbots such as ChatGPT are stepping into roles once filled solely by human relationships.

For many youth, the appeal is clear. "AI is always available. It never gets bored with you. It's never judgmental," said Ganesh Nair, 18, of Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified."

But Nair has also seen the downsides up close. After a friend used an AI chatbot to draft a breakup message to his girlfriend, Nair began to question the wisdom of relying on machines for relationship advice. "That felt a little bit dystopian, that a computer generated the end to a real relationship," he said. "It's almost like we're allowing computers to replace our relationships with people."
Robb cautioned that adolescence is a critical time when social and emotional skills are still developing, and that digital companions can't fully replace human relationships. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared for the real world," he said.

The Common Sense study didn't just measure usage; it also examined the risks of AI companions for young users. The group found that age restrictions on many platforms were either ineffective or nonexistent, exposing minors to sexual content, dangerous advice, and "validation" that can reinforce unhealthy thinking. The organization now recommends that people under 18 avoid such platforms altogether until more robust safeguards are in place.

National concern about teens forming close emotional bonds with chatbots intensified last year after a Florida boy died by suicide following sustained, intimate exchanges with an AI. Stories like this have underscored the vulnerabilities of young users and fueled growing calls for caution.

Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill, has noticed how quickly the change has occurred. "Parents really have no idea this is happening," Telzer said. "All of us are struck by how quickly this blew up."

According to her research, children as young as eight are already exploring generative AI, often for companionship or to navigate complex questions about identity. She found that apps like SpicyChat AI - designed for adults - are now popular among teens for role-playing. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," Telzer said. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not."
Many teens also use chatbots to help craft sensitive emails or social messages. Seventeen-year-old Bruce Perry, also from Arkansas, says he now defaults to AI for organizing essays, getting social advice, and even deciding what to wear. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He expressed concern for younger kids growing up with these tools: "I'm worried that kids could get lost in this. I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend."

While some teens are nervous about AI's influence, others say its impact feels fundamentally different from social media, which fostered new connections and visibility. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper - our need for attachment and our need to feel emotions. It feeds off of that." He called artificial intelligence "the new addiction."

The Common Sense study found that most teens still prefer real-life relationships over AI. But the growing reliance on digital friends for advice and emotional connection has become a defining part of modern adolescence - a shift that experts and families are only beginning to grasp.
[5]
18-year-old rising freshman is trying to break his AI habit because a friend used ChatGPT to dump someone
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions.

But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking."

For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving.

More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say.

As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. "AI is always available. It never gets bored with you.
It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified."

All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people."

In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are.

"It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May.

Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said.
The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions.

Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot.

"Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults.

Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not."

Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said.
He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend."

Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[6]
Here's how experts suggest protecting children from AI companions
Adults worried about children's use of artificial intelligence (AI) companions should learn to recognise unhealthy relationships, teach children that the platforms are programmed to be agreeable, and learn as much as possible about AI, experts say.

More than 70 per cent of American teenagers use artificial intelligence (AI) companions, according to a new study. US non-profit Common Sense Media asked 1,060 teens from April to May 2025 about how often they use AI companion platforms such as Character.AI, Nomi, and Replika. AI companion platforms are presented as "virtual friends, confidants, and even therapists" that engage with the user like a person, the report found.

The use of these companions worries experts, who told the Associated Press that the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots. Here are some suggestions on how to keep children safe when engaging with these platforms online.

One way to gauge whether a child is using AI companions is to just start a conversation "without judgement," according to Michael Robb, head researcher at Common Sense Media. To start the conversation, he said parents can approach a child or teenager with questions like "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?" "Listen and understand what appeals to your teen before being dismissive or saying you're worried about it," Robb said.

Mitch Prinstein, chief of psychology at the American Psychological Association (APA), said that one of the first things parents should do once they know a child uses AI companions is to teach them that they are programmed to be "agreeable and validating." Prinstein said it's important for children to know that that's not how real relationships work and that real friends can help them navigate difficult situations in ways that AI can't.
"We need to teach kids that this is a form of entertainment," Prinstein said. "It's not real, and it's really important they distinguish it from reality and [they] should not have it replace relationships in [their] actual life."

While AI companions may feel supportive, children need to know that these tools are not equipped to handle a real crisis or provide genuine support, the experts said.

Robb said some of the signs of these unhealthy relationships would be a preference by the child for AI interactions over real relationships, spending hours talking to their AI, or showing patterns of "emotional distress" when separated from the platforms. "Those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb said.

If kids are struggling with depression, anxiety, loneliness, an eating disorder, or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional.

Parents can also set rules about AI use, just like they do for screen time and social media, experts said. For example, they can set rules about how long the companion could be used and in what contexts.

Another way to counteract these relationships is to get involved and know as much about AI as possible. "I don't think people quite get what AI can do, how many teens are using it, and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it."
[7]
More teens say they're using AI for friendship. Here's why researchers are concerned
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. More than 70% of teens have used AI companions and half use them regularly, with 34% using them daily or multiple times a week, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. In an interview with "CBS Evening News" on Wednesday, Common Sense founder and CEO Jim Steyer said what struck him about the study is that AI companions are "everywhere in teens' lives." 
Common Sense's study also found that 11% of teens use AI companions to build up their courage and stand up for themselves, which Steyer said can be a good thing. However, he cautioned that problems arise when the technology replaces human relationships. "Younger kids really trust these AI companions to be like friends or parents or therapists," Steyer said. "They're talking about serious relationships, and these are robots. They're not human beings." As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. "AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. 
The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. When asked whether the issue at play is with the AI technology itself or the way kids live in the modern world today, Steyer said he believes it's both. "It's a challenge with how kids live today because they spend so many hours in front of a screen, and when you substitute a machine or a robot for human interaction, you're fundamentally changing the nature of that relationship," Steyer told CBS News. The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. While Common Sense's CEO said he supports the growth and innovation of AI, the group doesn't recommend that minors use AI companions. "In terms of its impact on young people, and on families in general, [the study] is an extraordinary finding and one that I think makes us very concerned about kids under the age of 18 being exposed to these kinds of companions," Steyer said. Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. 
The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. 
"Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
[8]
These tips from experts can help your teenager navigate AI companions
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: -- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. -- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. 
It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in their actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents. -- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says. -- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional. -- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. 
It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world." ___ The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[12]
These Tips From Experts Can Help Your Teenager Navigate AI Companions
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: -- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. -- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. 
It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents. -- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says. -- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional. -- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. 
It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world."
[13]
Teens Say They Are Turning to AI for Advice, Friendship and 'To Get Out of Thinking'
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. 'AI is always available. It never gets bored with you' More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. 
"AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." How many teens are using AI? New study stuns researchers In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. 
"If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. The nonprofit analyzed several popular AI companions in a " risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions. A concerning trend to teens and adults alike Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." 
Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
[14]
A New Study Finds Young People Are Using AI for Basically Everything
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. More than 70 percent of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. "AI is always available. It never gets bored with you. 
It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." In the Common Sense Media survey, 31 percent of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33 percent had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. 
The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions. Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. 
He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
[15]
These tips from experts can help your teenager navigate AI companions - The Economic Times
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: - Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. - Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. 
It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents. - Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them - those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says. - Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support - whether it is family, friends or a mental health professional. - Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. 
It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world."
[16]
Teens say they are turning to AI for friendship
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colours, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honours student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. 'AI is always available. It never gets bored with you' More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. 
"AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. 
"If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. The nonprofit analyzed several popular AI companions in a " risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions. Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character. AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." 
Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
[17]
Tips to help your teen navigate AI chatbots -- and what to watch out...
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: -- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. -- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. 
It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, with tips for parents. -- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says. -- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional. -- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. 
It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world."
[18]
Growing number of teens turn to AI for social interactions, new study...
It's not a glitch in the matrix: the youngest members of the iGeneration are turning to chatbot companions for everything from serious advice to simple entertainment. In the past few years, AI technology has advanced so far that users go straight to machine models for just about anything, and Generations Z and Alpha are leading the trend. Indeed, a May 2025 study by Common Sense Media looked into the social lives of 1,060 teens aged 13 to 17 and found that a startling 52% of adolescents across the country use chatbots at least once a month for social purposes. Teens who used AI chatbots to exercise social skills said they practiced conversation starters, expressing emotions, giving advice, conflict resolution, romantic interactions and self-advocacy -- and almost 40% of these users applied those skills in real conversations later on. Despite some potentially beneficial skill development, the study authors see the cultivation of anti-social behaviors, exposure to age-inappropriate content and potentially harmful advice given to teens as reason enough to caution against underage use. "No one younger than 18 should use AI companions," the study authors wrote in the paper's conclusion. The real alarm bells began to ring when the data showed that 33% of users prefer to turn to AI companions over real people for serious conversations, and 34% said that a conversation with a chatbot has caused them discomfort, referring to both subject matter and emotional response. "Until developers implement robust age assurance beyond self-attestation, and platforms are systematically redesigned to eliminate relational manipulation and emotional dependency risks, the potential for serious harm outweighs any benefits," the study authors warned. 
Though AI use is certainly spreading among younger generations -- a recent survey showed that 97% of Gen-Z has used the technology -- the Common Sense Media study found that 80% of teens said they still spend more time with IRL friends than with online chatbots. Rest easy, parents: today's teens do still prioritize human connections, despite popular beliefs. However, people of all generations are cautioned against consulting AI for certain purposes. As The Post previously reported, AI chatbots and large language models (LLMs) can be particularly harmful for those seeking therapy and tend to endanger those exhibiting suicidal thoughts. "AI tools, no matter how sophisticated, rely on pre-programmed responses and large datasets," Niloufar Esmaeilpour, a clinical counselor in Toronto, previously told The Post. "They don't understand the 'why' behind someone's thoughts or behaviors." Sharing personal medical information with AI chatbots can also have drawbacks, as the information they regurgitate isn't always accurate, and, perhaps more alarmingly, they are not HIPAA compliant. Uploading work documents to get a summary can also land you in hot water, as intellectual property agreements, confidential data and other company secrets can be extracted and potentially leaked.
[19]
AI Friendship Trend Among Teens Raises Red Flags for Mental Health
More Teens are Finding Emotional Comfort in AI Companions like ChatGPT, Replika, and Character.AI - But at What Cost? A new trend is influencing teenage relationships, and it doesn't involve friends from school or social media followers. Instead, it's about AI friendship. Many teens across the US are now building emotional connections with AI tools like ChatGPT, Replika, and Character.AI. These platforms, originally designed to answer questions or act as digital assistants, have evolved into something more personal and intimate. For some teenagers, these machines have quietly stepped into the space once reserved for close friends. Their conversations range from daily dilemmas to emotional struggles, and the AI listens without judging, offering support at any hour. In many cases, teens even say it feels safer and easier than opening up to people.
A new study reveals that over 70% of American teenagers are using AI companions for advice, emotional support, and decision-making, prompting worries about the impact on social skills and mental health.
A recent study by Common Sense Media has revealed a startling trend: more than 70% of American teenagers have used AI companions, with over half engaging with them regularly [1]. These digital friends, such as Character.AI, Nomi, and Replika, are designed to offer emotional support, companionship, and human-like conversations [2]. The study's lead author, Michael Robb, expressed surprise at the widespread adoption, stating, "It's eye-opening. When we set out to do this survey, we had no understanding of how many kids are actually using AI companions" [1].
For many teens, the appeal of AI companions is clear. Ganesh Nair, an 18-year-old from Arkansas, explains, "AI is always available. It never gets bored with you. It's never judgmental" [1]. This constant availability and non-judgmental nature make AI companions attractive to adolescents seeking advice and emotional support.
However, the study also uncovered concerning statistics: a third of teen users said they prefer turning to AI companions over real people for serious conversations, and 34% reported that a conversation with a chatbot had made them uncomfortable.
Experts are raising alarms about the potential impact on adolescent development. Mitch Prinstein, chief of psychology at the American Psychological Association, warns, "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life" [3]. The concern is that teens may not develop crucial social skills if they rely too heavily on AI for interaction and validation.
Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill, has found that children as young as 8 are using generative AI, and teens are using it to explore their sexuality and for companionship [1]. This early and extensive use of AI raises questions about its long-term effects on cognitive development, creativity, and critical thinking skills.
The impact of AI companions on real-world relationships is already becoming apparent. Nair shared a disturbing anecdote about a friend who used an AI companion to write a breakup text, ending a two-year relationship [1]. This incident highlights the potential for AI to depersonalize important human interactions and decisions.
Common Sense Media's risk assessment of popular AI companions found ineffective age restrictions and the potential for these platforms to produce sexual material, give dangerous advice, and offer harmful content [1]. As a result, the group recommends that minors not use AI companions.
Experts suggest that parents need to be more aware of and involved in their children's use of AI. Michael Robb advises parents to start conversations without judgment, help teens recognize the limitations of AI companions, and watch for signs of unhealthy attachments [3].
As AI technology continues to advance rapidly, the need for regulation and safety guardrails for children becomes increasingly urgent. The integration of AI into adolescent life presents both opportunities and challenges, requiring careful consideration from parents, educators, and policymakers to ensure healthy development in the digital age.
Summarized by Navi