Curated by THEOUTPOST
On Wed, 12 Mar, 5:44 PM UTC
Student privacy vs. safety: The AI surveillance dilemma in WA schools
One student asked a search engine, "Why does my boyfriend hit me?" Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.

In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Southwest Washington.

Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings. The goal is to keep children safe, but these tools raise serious questions about privacy and security -- as shown when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district's surveillance technology.

The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives. Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots.

Vancouver school staff and anyone else with links to the files could read everything. The documents were not protected by firewalls or passwords, and student names were not redacted, which cybersecurity experts warned was a massive security risk.

The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology's unintended consequences in American schools. In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.
Gaggle Safety Management, the developer of the software that tracks Vancouver students' online activity, believes not monitoring children is like letting them loose on "a digital playground without fences or recess monitors," CEO and founder Jeff Patterson said.

Roughly 1,500 school districts nationwide use Gaggle's software to track the online activity of approximately 6 million students. It's one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance. The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian's surveillance products in 2021.

Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students' well-being. "I don't think we could ever put a price on protecting students," said Andy Meyer, principal of Vancouver's Skyview High School. "Anytime we learn of something like that and we can intervene, we feel that is very positive."

Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations. "That's not good at all," Foster said after learning the district inadvertently released the records. "But what are my options? What do I do? Pull my kid out of school?" Foster says she'd be upset if her daughter's private information was compromised. "At the same time," she said, "I would like to avoid a school shooting or suicide."

How does student surveillance work?

Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device.
The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years -- approximately the cost of employing one extra counselor.

The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.

A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately. "A lot of times, families don't know. We open that door for that help," the counselor said. Gaggle is "good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged."

Seattle Times and AP reporters saw what kind of writing set off Gaggle's alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren't protected by a password.

After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer. The company says the links must be accessible without a login during those 72 hours so emergency contacts -- who often receive these alerts late at night on their phones -- can respond quickly.

In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence.
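The flag-review-escalate flow described above can be sketched in a few lines of code. This is a purely illustrative model: the keyword lists, category names, and severity rules below are invented stand-ins, not Gaggle's actual machine-learning classifier, which is not public.

```python
# Illustrative sketch of an automated first-pass content screen.
# Hypothetical throughout; real systems use trained ML models, not keywords.
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    POSSIBLE_ISSUE = 1    # queued for human review before the school is told
    IMMINENT_DANGER = 2   # direct phone call to school officials


# Stand-in for a trained classifier: hypothetical category -> trigger phrases.
KEYWORD_CATEGORIES = {
    "bullying": ["everyone hates you", "laughs at me"],
    "self_harm": ["hurt myself", "cutting"],
    "violence": ["bring a gun", "kill them"],
}


@dataclass
class Alert:
    category: str
    excerpt: str
    severity: Severity


def scan_text(text: str) -> list[Alert]:
    """Automated first pass: flag phrases and assign a provisional severity."""
    lowered = text.lower()
    alerts = []
    for category, phrases in KEYWORD_CATEGORIES.items():
        for phrase in phrases:
            if phrase in lowered:
                severity = (Severity.IMMINENT_DANGER
                            if category == "violence"
                            else Severity.POSSIBLE_ISSUE)
                alerts.append(Alert(category, phrase, severity))
    return alerts
```

In the pipeline the article describes, anything this automated pass flags would still go to human reviewers before any school is contacted; the sketch covers only the automated first step.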
While some alerts were serious, many turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends. Foster's daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal's office after writing a short story featuring a scene with mildly violent imagery.

"I'm glad they're being safe about it, but I also think it can be a bit much," Bryn said.

School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly. "It allows me the opportunity to meet with a student I maybe haven't met before and build that relationship," said Chele Pierce, a Skyview High School counselor.

Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district's enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, about 1 in 4 students had communications that triggered a Gaggle alert.

While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There's no independent research showing it measurably lowers student suicide rates or reduces violence. A 2023 RAND study found only "scant evidence" of either benefits or risks from AI surveillance, concluding: "No research to date has comprehensively examined how these programs affect youth suicide prevention."

"If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention," said report co-author Benjamin Boudreaux, an AI ethics researcher.

LGBTQ+ students are most vulnerable

In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, trans or struggling with gender dysphoria.
LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and to turn to the internet for support. "We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver," said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.

In one screenshot, a Vancouver high schooler wrote in a Google survey form that they'd been subject to trans slurs and racist bullying. It is unclear who created the survey, but the person behind it had falsely promised confidentiality: "I am not a mandated reporter, please tell me the whole truth."

When North Carolina's Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful. But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.

Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers had promised a student confidentiality for an assignment related to mental health. A classmate was then "blindsided" when Gaggle alerted school officials about something private they'd disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.

"You can't just (surveil) people and not tell them. That's a horrible breach of security and trust," said Thompson, now a college student, in an interview.

After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.

Parents don't really know

The debate over privacy and security is complicated, and parents are often unaware it's even an issue.
Pearce, the University of Washington professor, doesn't remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district's responsible use form before her son received a school laptop. Even when families learn about school surveillance, they may be unable to opt out.

Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class. For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school, instead of being forced to use a district one, because of privacy concerns. The district refused Reiland's request.

When his daughter, Zoe, found out about Gaggle, she says she felt so "freaked out" that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn't want to get called into the office for "searching up lady parts." "I was too scared to be curious," she said.

School officials say they don't track metrics measuring the technology's efficacy but believe it has saved lives. Yet technology alone doesn't create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with "deliberate indifference" to some families' reports of sexual harassment, mainly in the form of homophobic bullying.

During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide. When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology, responded: "This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying."
Long-term effects unknown

Despite the risks, surveillance technology can help teachers intervene before a tragedy. A middle school student in the Seattle-area Highline Public Schools who was potentially being trafficked used Gaggle to communicate with campus staff, said former superintendent Susan Enfield. "They knew that the staff member was reading what they were writing," Enfield said. "It was, in essence, that student's way of asking for help."

Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support. "The idea that kids are constantly under surveillance by adults -- I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in," said Boudreaux, the AI ethics researcher.

Gaggle's Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, "the school's going to be held liable," he said. "If you're looking for that open free expression, it really can't happen on the school system's computers."
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
Takeaways From Our Investigation on AI-Powered School Surveillance
Thousands of American schools are turning to AI-powered surveillance technology for 24/7 monitoring of student accounts and school-issued devices like laptops and tablets. The goal is to keep children safe, especially amid a mental health crisis and the threat of school shootings. Machine-learning algorithms detect potential indicators of problems like bullying, self-harm or suicide and then alert school officials.

But these tools raise serious questions about privacy and security. In fact, when The Seattle Times and The Associated Press partnered to investigate school surveillance, reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request. The documents were stored without a password or firewall, and anyone with the link could read them.

Here are key takeaways from the investigation.

Surveillance tech like Gaggle isn't always secure

The privacy and security risks became apparent when Seattle Times and AP reporters submitted a public records request to Vancouver Public Schools in Washington, seeking information about the kind of content flagged by the monitoring tool Gaggle. Used by around 1,500 districts, Gaggle is one of many companies offering surveillance services, including GoGuardian and Securly.

Gaggle saved screenshots of digital activity that set off each alert. School officials accidentally provided the reporters with links to them, not realizing they weren't protected by a password. Students in these documents opened up about the most intimate aspects of their personal lives, including suicide attempts.

After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.
The company says the links must be accessible without a login during those 72 hours so the school's emergency contacts -- who often receive these alerts late at night on their phones -- can respond quickly.

There's no independent research showing surveillance tech increases safety

The long-term effects of surveillance technology on safety are unclear. No independent studies have shown it measurably lowers student suicide rates or reduces violence. A 2023 RAND report found only "scant evidence" of either benefits or risks from artificial intelligence surveillance.

"If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention," said report co-author Benjamin Boudreaux, an AI ethics researcher.

Experts warn that having privacy to express feelings is important to healthy child development. But proponents of digital monitoring point out school computers are not the appropriate setting for this kind of unlimited self-exploration.

LGBTQ+ students are particularly vulnerable

Surveillance software poses unique risks to LGBTQ+ students, advocates warn. In the records released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria.

When Durham Public Schools in North Carolina piloted Gaggle, an LGBTQ+ advocate reported that a Gaggle alert about self-harm had led to a student being outed to their family. Another student raised concerns about losing trust in teachers. The board voted to stop using the technology, finding it wasn't worth the risk of eroding relationships with adults.

Parents often don't know their kids are being watched

Parents interviewed for this article said their child's school either did not disclose it used surveillance software or buried the disclosure in long technology use forms. Even when families are aware of surveillance, schools may refuse to let them opt out.
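The 72-hour access window described above amounts to a simple time-based rule: anyone with the link can view a screenshot within 72 hours of the alert; after that, a Gaggle login is required. A minimal sketch of that policy follows; the function name and parameters are invented for illustration, and this is not Gaggle's actual implementation.

```python
# Hypothetical model of a 72-hour unauthenticated viewing window for
# alert screenshot links, per the policy described in the article.
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(hours=72)


def can_view(link_created: datetime, logged_in: bool, now: datetime) -> bool:
    """Within the grace period anyone holding the link may view the
    screenshot; afterwards only authenticated account holders may."""
    if logged_in:
        return True
    return now - link_created <= GRACE_PERIOD
```

The design tradeoff the sketch makes visible is the one the company describes: the unauthenticated window sacrifices some security so that emergency contacts reached late at night can open an alert without logging in first.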
"Imagine growing up in a world where everything you've ever said on a computer is monitored by the government," said Tim Reiland, who unsuccessfully lobbied his kids' school district in Owasso, Oklahoma, to let his children opt out of Gaggle. "And you just have to accept it and move on. What kind of adults are we creating?" ____ The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org. Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
[4]
Schools use AI to monitor kids, but investigation finds risks
One student asked a search engine, "Why does my boyfriend hit me?" Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves. In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state. Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings. The goal is to keep children safe, but these tools raise serious questions about privacy and security -- as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district's surveillance technology. ___ The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times. ___ The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives. Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots. Vancouver school staff and anyone else with links to the files could read everything. Firewalls or passwords didn't protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk. 
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology's unintended consequences in American schools. In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe. Gaggle Safety Management, the company that developed the software that tracks Vancouver school students' online activity, believes not monitoring children is like letting them loose on "a digital playground without fences or recess monitors," CEO and founder Jeff Patterson said. Roughly 1,500 school districts nationwide use Gaggle's software to track the online activity of approximately 6 million students. It's one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance. The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian's surveillance products in 2021. Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students' well-being. "I don't think we could ever put a price on protecting students," said Andy Meyer, principal of Vancouver's Skyview High School. "Anytime we learn of something like that and we can intervene, we feel that is very positive." Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations. "That's not good at all," Foster said after learning the district inadvertently released the records. "But what are my options? What do I do? Pull my kid out of school?" Foster says she'd be upset if her daughter's private information was compromised. "At the same time," she said, "I would like to avoid a school shooting or suicide."
Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years -- approximately the cost of employing one extra counselor. The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check. A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately. "A lot of times, families don't know. We open that door for that help," the counselor said. Gaggle is "good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged." Seattle Times and AP reporters saw what kind of writing set off Gaggle's alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren't protected by a password. After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer. 
The company says the links must be accessible without a login during those 72 hours so emergency contacts -- who often receive these alerts late at night on their phones -- can respond quickly. In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends. Foster's daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal's office after writing a short story featuring a scene with mildly violent imagery. "I'm glad they're being safe about it, but I also think it can be a bit much," Bryn said. School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly. "It allows me the opportunity to meet with a student I maybe haven't met before and build that relationship," said Chele Pierce, a Skyview High School counselor. Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district's enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert. While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There's no independent research showing it measurably lowers student suicide rates or reduces violence. A 2023 RAND study found only "scant evidence" of either benefits or risks from AI surveillance, concluding: "No research to date has comprehensively examined how these programs affect youth suicide prevention." 
"If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention," said report co-author Benjamin Boudreaux, an AI ethics researcher. In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria. LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and turn to the internet for support. "We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver," said Katy Pearce, a University of Washington professor who researches technology in authoritarian states. In one screenshot, a Vancouver high schooler wrote in a Google survey form they'd been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: "I am not a mandated reporter, please tell me the whole truth." When North Carolina's Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful. But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive. Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then "blindsided" when Gaggle alerted school officials about something private they'd disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle. "You can't just (surveil) people and not tell them. That's a horrible breach of security and trust," said Thompson, now a college student, in an interview. 
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults. The debate over privacy and security is complicated, and parents are often unaware it's even an issue. Pearce, the University of Washington professor, doesn't remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district's responsible use form before her son received a school laptop. Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class. For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking, citing privacy concerns, whether his daughter could bring her personal laptop to school instead of using a district-issued one. The district refused Reiland's request. When Reiland's daughter, Zoe, found out about Gaggle, she said she felt so "freaked out" that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn't want to get called into the office for "searching up lady parts." "I was too scared to be curious," she said. School officials say they don't track metrics measuring the technology's efficacy but believe it has saved lives. Yet technology alone doesn't create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with "deliberate indifference" to some families' reports of sexual harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide. When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology, responded: "This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying." Despite the risks, surveillance technology can help teachers intervene before a tragedy. A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former Superintendent Susan Enfield. "They knew that the staff member was reading what they were writing," Enfield said. "It was, in essence, that student's way of asking for help." Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support. "The idea that kids are constantly under surveillance by adults -- I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in," said Boudreaux, the AI ethics researcher. Gaggle's Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, "the school's going to be held liable," he said. "If you're looking for that open free expression, it really can't happen on the school system's computers."
[5]
Takeaways from our investigation on AI-powered school surveillance
Thousands of American schools are turning to AI-powered surveillance technology for 24/7 monitoring of student accounts and school-issued devices like laptops and tablets. The goal is to keep children safe, especially amid a mental health crisis and the threat of school shootings. Machine-learning algorithms detect potential indicators of problems like bullying, self-harm or suicide and then alert school officials. But these tools raise serious questions about privacy and security. In fact, when The Seattle Times and The Associated Press partnered to investigate school surveillance, reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request. The documents were stored without a password or firewall, and anyone with the link could read them. Here are key takeaways from the investigation. The privacy and security risks became apparent when Seattle Times and AP reporters submitted a public records request to Vancouver Public Schools in Washington, seeking information about the kind of content flagged by the monitoring tool Gaggle. Used by around 1,500 districts, Gaggle is one of many different companies offering surveillance services, including GoGuardian and Securly. Gaggle saved screenshots of digital activity that set off each alert. School officials accidentally provided the reporters with links to them, not realizing they weren't protected by a password. Students in these documents opened up about the most intimate aspects of their personal lives, including suicide attempts. After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer. 
The company says the links must be accessible without a login during those 72 hours so the school's emergency contacts -- who often receive these alerts late at night on their phones -- can respond quickly. The long-term effects of surveillance technology on safety are unclear. No independent studies have shown it measurably lowers student suicide rates or reduces violence. A 2023 RAND report found only "scant evidence" of either benefits or risks from artificial intelligence surveillance. "If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention," said report co-author Benjamin Boudreaux, an AI ethics researcher. Experts warn having privacy to express feelings is important to healthy child development. But proponents of digital monitoring point out school computers are not the appropriate setting for this kind of unlimited self-exploration. Surveillance software poses unique risks to LGBTQ+ students, advocates warn. In the records released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria. When Durham Public Schools in North Carolina piloted Gaggle, an LGBTQ+ advocate reported a Gaggle alert about self-harm had led to a student being outed to their family. Another student raised concerns about losing trust in teachers. The board voted to stop using the technology, finding it wasn't worth the risk of eroding relationships with adults. Parents interviewed for this article said their child's school either did not disclose it used surveillance software or buried the disclosure in long technology use forms. Even when families are aware of surveillance, schools may refuse to let them opt out.
"Imagine growing up in a world where everything you've ever said on a computer is monitored by the government," said Tim Reiland, who unsuccessfully lobbied his kids' school district in Owasso, Oklahoma, to let his children opt out of Gaggle. "And you just have to accept it and move on. What kind of adults are we creating?"
An investigation reveals the widespread use of AI-powered surveillance technology in American schools, raising concerns about student privacy and data security while aiming to address mental health and safety issues.
In an era of increasing mental health concerns and school safety threats, thousands of American schools have turned to AI-powered surveillance technology to monitor student activities on school-issued devices. This 24/7 monitoring system, implemented by companies like Gaggle, GoGuardian, and Securly, aims to detect potential indicators of problems such as bullying, self-harm, or suicide.[1][2]
Gaggle Safety Management, one of the leading providers, uses machine learning algorithms to scan students' online activities, including searches, emails, and documents. The system flags potentially concerning content, which is then reviewed by human moderators before alerting school officials.[3] With approximately 1,500 school districts nationwide using Gaggle's software to monitor about 6 million students, the reach of this technology is significant.[2][4]
While the intention behind this technology is to keep children safe, an investigation by The Seattle Times and The Associated Press has revealed serious privacy and security risks. In a stark example of these risks, reporters inadvertently gained access to almost 3,500 sensitive, unredacted student documents through a public records request to Vancouver Public Schools in Washington state.[1][2]
These documents contained deeply personal information, including students' struggles with depression, suicide attempts, addiction, and gender identity issues. The incident highlighted the potential for massive security breaches, as the files were not protected by firewalls or passwords.[3][4]
The investigation also uncovered particular risks for LGBTQ+ students. In the Vancouver schools' records, at least six students were potentially outed to school officials after writing about being gay, transgender, or struggling with gender dysphoria.[5] This raises concerns about the technology's potential to inadvertently disclose sensitive personal information without students' consent.
Despite the widespread adoption of these surveillance tools, there is a lack of independent research demonstrating their effectiveness in improving student safety or reducing suicide rates. A 2023 RAND report found only "scant evidence" of either benefits or risks from AI surveillance in schools.[5]
The investigation also revealed that many parents are unaware that their children's online activities are being monitored. Some schools either do not disclose the use of surveillance software or bury the information in lengthy technology use forms. Even when parents are aware, they may not be given the option to opt out.[5]
In response to the security issues uncovered by the investigation, Gaggle has updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the flagged content screenshots.[3][4] However, this incident has sparked a broader debate about the balance between student safety and privacy in the digital age.
As schools continue to grapple with these complex issues, the use of AI-powered surveillance in educational settings remains a contentious topic, highlighting the need for more transparent policies, improved security measures, and ongoing discussions about the ethical implications of monitoring students' digital lives.