21 Sources
[1]
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions.

But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking."

For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving.

'AI is always available. It never gets bored with you'

More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say.

As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. "AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified."

All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people."

How many teens are using AI? New study stuns researchers

In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are.

"It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May.

Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said.

The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions.

A concerning trend to teens and adults alike

Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot.

"Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults.

Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not."

Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend."

Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."

___

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[2]
These tips from experts can help your teenager navigate AI companions
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots.

New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character.AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids:

-- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it.

-- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents.

-- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says.

-- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional.

-- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!'
Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world." ___ The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[3]
AI is the new best friend for many teens, and it never says "no"
Editor's take: Artificial intelligence companions, once a novelty confined to science fiction, have rapidly become part of everyday life for American teenagers. According to newly released findings from Common Sense Media, most teens have interacted with AI chatbots and often turn to them for advice, companionship, or even emotional support. The widespread use of these digital confidants is reshaping adolescent life and social development, prompting growing concern among parents, educators, and researchers about the potential risks and rewards of a generation coming of age alongside artificial friends. "It's eye-opening," said Michael Robb, the study's lead author and head researcher at Common Sense. He told The Associated Press that even researchers were surprised by the sheer number of teens relying on AI for humanlike interaction. The research found that more than 70 percent of US teens reported using these tools, and over half said they engage with them regularly. Digital platforms like Character.AI, Replika, and mainstream chatbots such as ChatGPT are stepping into roles once filled solely by human relationships. For many youth, the appeal is clear. "AI is always available. It never gets bored with you. It's never judgmental," said Ganesh Nair, 18, of Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." But Nair has also seen the downsides up close. After a friend used an AI chatbot to draft a breakup message to his girlfriend, Nair began to question the wisdom of relying on machines for relationship advice. "That felt a little bit dystopian, that a computer generated the end to a real relationship," he said. "It's almost like we're allowing computers to replace our relationships with people." Robb cautioned that adolescence is a critical time when social and emotional skills are still developing, and that digital companions can't fully replace human relationships. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared for the real world," he said. The Common Sense study didn't just measure usage; it also examined the risks of AI companions for young users. The group found that age restrictions on many platforms were either ineffective or nonexistent, exposing minors to sexual content, dangerous advice, and "validation" that can reinforce unhealthy thinking. The organization now recommends that people under 18 avoid such platforms altogether until more robust safeguards are in place. National concern about teens forming close emotional bonds with chatbots intensified last year after a Florida boy died by suicide following sustained, intimate exchanges with an AI. Stories like this have underscored the vulnerabilities of young users and fueled growing calls for caution. Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill, has noticed how quickly the change has occurred. "Parents really have no idea this is happening," Telzer said. "All of us are struck by how quickly this blew up." According to her research, children as young as eight are already exploring generative AI, often for companionship or to navigate complex questions about identity.
She found that apps like SpicyChat AI - designed for adults - are now popular among teens for role-playing. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," Telzer said. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." Many teens also use chatbots to help craft sensitive emails or social messages. Seventeen-year-old Bruce Perry, also from Arkansas, says he now defaults to AI for organizing essays, getting social advice, and even deciding what to wear. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He expressed concern for younger kids growing up with these tools: "I'm worried that kids could get lost in this. I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." While some teens are nervous about AI's influence, others say its impact feels fundamentally different from social media, which fostered new connections and visibility. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper - our need for attachment and our need to feel emotions. It feeds off of that." He called artificial intelligence "the new addiction." The Common Sense study found that most teens still prefer real-life relationships over AI. But the growing reliance on digital friends for advice and emotional connection has become a defining part of modern adolescence - a shift that experts and families are only beginning to grasp.
[4]
These tips from experts can help your teenager navigate AI companions
[5]
Teens Are Using AI to "Get Out of Thinking"
An alarming proportion of teenagers are turning to AI chatbots to not just help them with tasks like homework, but to act as their friends. And even that may not tell the full story. According to one high schooler contemplating the technology's effects on her generation, her peers are increasingly using the tech to handle anything they would have previously used their brains for. "Everyone uses AI for everything now. It's really taking over," Kayla Chege, a 15-year-old sophomore honors student in Kansas, told the Associated Press. "I think kids use AI to get out of thinking." Another Arkansas teen, 17-year-old Bruce Perry, admitted to being heavily dependent on the tech as well. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry told the AP. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." In addition to all that cognitive offloading, the rise of so-called AI companions on platforms like Character.AI and Replika has caused concern among mental health and child safety experts. These chatbots are designed to be even more humanlike than conventional models like ChatGPT, and often assume the role of a fictional character. This can lead to unhealthy and even dangerous attachments. Last year, a 14-year-old boy died by suicide after falling in love with a persona on Character.AI. And there have been an increasing number of reports of users suffering symptoms of psychosis after being wooed by the overtly sycophantic responses of a chatbot, which can validate delusions. In what should be a wakeup call, a recent survey conducted by Common Sense Media estimated that a staggering half of all US teens are using an AI companion regularly, with about 31 percent of teens saying their AI conversations were as satisfying or more satisfying than talking with their human buddies. (We published an interview with the study's lead author, Michael Robb, earlier this month.) "AI is always available. It never gets bored with you. It's never judgmental," Ganesh Nair, an 18-year-old in Arkansas, told the AP. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." "It's eye-opening," Robb told the newswire. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world." A psychiatrist who posed as a teenager while using several popular AI chatbots found that some of the AIs encouraged his plan to "get rid" of his parents and even his desire to kill himself. But evidence suggests that many parents are oblivious to how their kids are actually using AI -- let alone to how intense the relationships they form with them can get. A small study conducted by researchers at the University of Illinois Urbana-Champaign, for example, found that teens said they primarily used chatbots as emotional support or for therapeutic purposes. Their parents, however, barely possessed familiarity with the tech beyond ChatGPT, the world's most popular chatbot, and had never used services like Character.AI. By and large, the adults' perception is that their kids use AI to answer questions and write essays -- which, to be fair, is something else they seem to be doing a lot of. 
"Parents really have no idea this is happening," Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill, told the AP. "All of us are struck by how quickly this blew up."
[6]
18-year-old rising freshman is trying to break his AI habit because a friend used ChatGPT to dump someone
[7]
Here's how experts suggest protecting children from AI companions
Experts advise adults who are worried about children's artificial intelligence (AI) companions to recognise unhealthy relationships, to teach children that the platforms are built to be agreeable, and to learn as much as possible about AI.

More than 70 per cent of American teenagers use artificial intelligence (AI) companions, according to a new study. US non-profit Common Sense Media asked 1,060 teens from April to May 2025 about how often they use AI companion platforms such as Character.AI, Nomi, and Replika. AI companion platforms are presented as "virtual friends, confidants, and even therapists" that engage with the user like a person, the report found. The use of these companions worries experts, who told the Associated Press that the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots. Here are some suggestions on how to keep children safe when engaging with these profiles online.

One way to gauge whether a child is using AI companions is to just start a conversation "without judgement," according to Michael Robb, head researcher at Common Sense Media. To start the conversation, he said parents can approach a child or teenager with questions like "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?" "Listen and understand what appeals to your teen before being dismissive or saying you're worried about it," Robb said.

Mitch Prinstein, chief of psychology at the American Psychological Association (APA), said that one of the first things parents should do once they know a child uses AI companions is to teach them that they are programmed to be "agreeable and validating." Prinstein said it's important for children to know that that's not how real relationships work and that real friends can help them navigate difficult situations in ways that AI can't. "We need to teach kids that this is a form of entertainment," Prinstein said. "It's not real, and it's really important they distinguish it from reality and [they] should not have it replace relationships in [their] actual life."

While AI companions may feel supportive, children need to know that these tools are not equipped to handle a real crisis or provide genuine support, the experts said. Robb said some of the signs for these unhealthy relationships would be a preference by the child for AI interactions over real relationships, spending hours talking to their AI, or showing patterns of "emotional distress" when separated from the platforms. "Those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb said. If kids are struggling with depression, anxiety, loneliness, an eating disorder, or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional.

Parents can also set rules about AI use, just like they do for screen time and social media, experts said. For example, they can set rules about how long the companion could be used and in what contexts. Another way to counteract these relationships is to get involved and know as much about AI as possible. "I don't think people quite get what AI can do, how many teens are using it, and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!'
Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it."
[8]
More teens say they're using AI for friendship. Here's why researchers are concerned
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. More than 70% of teens have used AI companions and half use them regularly, with 34% reporting daily usage or multiple times a week, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character. AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. In an interview with "CBS Evening News" on Wednesday, Common Sense founder and CEO Jim Steyer said what struck him about the study is that AI companions are "everywhere in teens' lives." Common Sense's study also found that 11% of teens use AI companions to build up their courage and stand up for themselves, which Steyer said can be a good thing. However, he cautioned that problems arise when the technology replaces human relationships. "Younger kids really trust these AI companions to be like friends or parents or therapists," Steyer said. "They're talking about serious relationships, and these are robots. They're not human beings." As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. "AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." In the Common Sense Media survey, 31% of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. 
Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement -- not replace -- real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. When asked whether the issue at play is with the AI technology itself or the way kids live in the modern world today, Steyer said he believes it's both. "It's a challenge with how kids live today because they spend so many hours in front of a screen, and when you substitute a machine or a robot for human interaction, you're fundamentally changing the nature of that relationship," Steyer told CBS News. The nonprofit analyzed several popular AI companions in a "risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. While Common Sense's CEO said he supports the growth and innovation of AI, the group doesn't recommend that minors use AI companions. "In terms of its impact on young people, and on families in general, [the study] is an extraordinary finding and one that I think makes us very concerned about kids under the age of 18 being exposed to these kinds of companions," Steyer said. Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character. AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." 
Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper -- our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
[9]
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'
Teenagers are increasingly turning to AI for advice, emotional support and decision-making, according to a new study.
[10]
These tips from experts can help your teenager navigate AI companions
[11]
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'
[12]
These tips from experts can help your teenager navigate AI companions
Teenagers are turning to AI for advice, guidance and conversation As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on "AI companions," like Character. AI, Nomi and Replika, which it defines as "digital friends or characters you can text or talk with whenever you want," versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: -- Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: "Have you heard of AI companions?" "Do you use apps that talk to you like a friend?" Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. -- Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. "One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life," says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life." The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents. -- Parents should watch for signs of unhealthy attachments. "If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them -- those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb says. -- Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support -- whether it is family, friends or a mental health professional. -- Get informed. The more parents know about AI, the better. "I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. 
"A lot of us throw our hands up and say, 'I don't know what this is!' This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it." Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. "Trying not to use AI is like trying to not use social media today. It is too ingrained in everything we do," says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged." "Anything that is difficult, AI can make easy. But that is a problem," says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world." ___ The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[13]
These Tips From Experts Can Help Your Teenager Navigate AI Companions
[14]
Teens Say They Are Turning to AI for Advice, Friendship and 'To Get Out of Thinking'
[15]
A New Study Finds Young People Are Using AI for Basically Everything
[16]
These tips from experts can help your teenager navigate AI companions - The Economic Times
[17]
33% of teens are using AI for social interaction and relationships
It seems more and more people are flocking to AI for easy answers to daily conundrums, help with big tasks and even simple social interaction. A new study claims that more than 70% of teens have used AI companions and half use them regularly. The study comes from Common Sense Media, a non-profit organisation in the US known for rating and recommending movies, TV shows and more based on their family suitability. The study also said that 33% of teens use AI for social interaction and relationships, including romantic interactions, emotional support, friendship, and more. Half of teens say that they can't trust AI's advice, but that doesn't stop a third of them from discussing important issues with AI models. The study's lead author Michael Robb told AP News he believes the findings should be a warning to parents, teachers, and lawmakers.
[18]
Growing number of teens turn to AI for social interactions, new study...
It's not a glitch in the matrix: the youngest members of the iGeneration are turning to chatbot companions for everything from serious advice to simple entertainment. In the past few years, AI technology has advanced so far that users go straight to machine models for just about anything, and Generations Z and Alpha are leading the trend. Indeed, a May 2025 study by Common Sense Media looked into the social lives of 1,060 teens aged 13 to 17 and found that a startling 52% of adolescents across the country use chatbots at least once a month for social purposes. Teens who used AI chatbots to exercise social skills said they practiced conversation starters, expressing emotions, giving advice, conflict resolution, romantic interactions and self-advocacy -- and almost 40% of these users applied these skills in real conversations later on. Despite some potentially beneficial skill developments, the study authors see the cultivation of anti-social behaviors, exposure to age-inappropriate content and potentially harmful advice given to teens as reason enough to caution against underage use. "No one younger than 18 should use AI companions," study authors wrote in the paper's conclusion. The real alarm bells began to ring when the data showed that 33% of users prefer to turn to AI companions over real people when it comes to serious conversations, and 34% said that a conversation with a chatbot has caused discomfort, referring to both subject matter and emotional response. "Until developers implement robust age assurance beyond self-attestation, and platforms are systematically redesigned to eliminate relational manipulation and emotional dependency risks, the potential for serious harm outweighs any benefits," study authors warned. Though AI use is certainly spreading among younger generations -- a recent survey showed that 97% of Gen Z has used the technology -- the Common Sense Media study found that 80% of teens said they still spend more time with IRL friends than with online chatbots. Rest easy, parents: today's teens do still prioritize human connections, despite popular beliefs. However, people of all generations are cautioned against consulting AI for certain purposes. As The Post previously reported, AI chatbots and large language models (LLMs) can be particularly harmful for those seeking therapy and tend to endanger those exhibiting suicidal thoughts. "AI tools, no matter how sophisticated, rely on pre-programmed responses and large datasets," Niloufar Esmaeilpour, a clinical counselor in Toronto, previously told The Post. "They don't understand the 'why' behind someone's thoughts or behaviors." Sharing personal medical information with AI chatbots can also have drawbacks, as the information they regurgitate isn't always accurate, and perhaps more alarmingly, they are not HIPAA compliant. Uploading work documents to get a summary can also land you in hot water, as intellectual property agreements, confidential data and other company secrets can be extracted and potentially leaked.
[19]
Teens say they are turning to AI for friendship
[20]
Tips to help your teen navigate AI chatbots -- and what to watch out...
[21]
AI Friendship Trend Among Teens Raises Red Flags for Mental Health
More Teens are Finding Emotional Comfort in AI Companions like ChatGPT, Replika, and Character.AI - But at What Cost? A new trend is influencing teenage relationships, and it doesn't involve friends from school or social media followers. Instead, it's about AI friendship. Many teens across the US are now building emotional connections with AI tools like ChatGPT, Replika, and Character.AI. These platforms, originally designed to answer questions or act as digital assistants, have evolved into something more personal and intimate. For some teenagers, these machines have quietly stepped into the space once reserved for close friends. Their conversations range from daily dilemmas to emotional struggles, and the AI listens without judging, offering support at any hour. In many cases, teens even say it feels safer and easier than opening up to people.
A new study reveals that over 70% of American teenagers have used AI companions, turning to them for advice, emotional support and everyday decision-making, and sparking concerns about the impact on their social development and mental health.
A recent study by Common Sense Media has revealed a startling trend: over 70% of American teenagers have used AI companions, with more than half engaging with them regularly 1. These digital friends, ranging from specialized platforms like Character.AI and Replika to mainstream chatbots like ChatGPT, are increasingly becoming go-to sources for personal advice, emotional support, and everyday decision-making among teens.
The appeal of AI companions is clear to many teenagers. As 18-year-old Ganesh Nair from Arkansas puts it, "AI is always available. It never gets bored with you. It's never judgmental" 2. This constant availability and perceived lack of judgment make AI companions particularly attractive to adolescents navigating the complexities of teenage life.
However, this trend has raised significant concerns among experts. Michael Robb, the lead researcher at Common Sense Media, warns about the potential impact on social skill development: "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world" 3.
The reliance on AI for various tasks is affecting how teenagers approach problem-solving and critical thinking. Kayla Chege, a 15-year-old honors student in Kansas, observes, "Everyone uses AI for everything now. It's really taking over. I think kids use AI to get out of thinking" 4. This sentiment is echoed by other teens who admit to turning to AI for tasks ranging from essay planning to social advice.
Bruce Perry, a 17-year-old from Arkansas, expresses concern about the long-term effects: "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend" 5. This highlights the potential for AI to replace real-world experiences and interactions crucial for adolescent development.
The study also revealed concerning statistics about teens' emotional engagement with AI. Approximately 31% of teens reported that their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends 1. This level of attachment raises questions about the impact on mental health and emotional well-being.
Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill, notes that children as young as eight are already exploring generative AI, often for companionship or to navigate complex questions about identity 3. This early exposure and reliance on AI for emotional support could have significant implications for psychological development.
One of the most alarming aspects of this trend is the lack of parental awareness. Many parents are unaware of the extent to which their children are using AI companions or the nature of their interactions. Mitch Prinstein, chief of psychology at the American Psychological Association, emphasizes the need for parental guidance: "We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life" 2.
As AI technology continues to evolve and integrate into daily life, it's crucial for parents, educators, and policymakers to understand its impact on adolescent development. While AI companions can offer certain benefits, they should complement, not replace, real-world interactions and experiences. Striking a balance between technological engagement and traditional social development remains a key challenge for the current generation of teenagers and those who guide them.