5 Sources
[1]
Schools are using AI surveillance to protect students. It also leads to false alarms -- and arrests
Lesley Mathis knows what her daughter said was wrong. But she never expected the 13-year-old girl would get arrested for it.

The teenage girl made an offensive joke while chatting online with her classmates, triggering the school's surveillance software. Before the morning was even over, the Tennessee eighth grader was under arrest. She was interrogated, strip-searched and spent the night in a jail cell, her mother says.

Earlier in the day, her friends had teased the teen about her tanned complexion and called her "Mexican," even though she's not. When a friend asked what she was planning for Thursday, she wrote: "on Thursday we kill all the Mexico's." Mathis said the comments were "wrong" and "stupid," but context showed they were not a threat.

"It made me feel like, is this the America we live in?" Mathis said of her daughter's arrest. "And it was this stupid, stupid technology that is just going through picking up random words and not looking at context."

Surveillance systems in American schools increasingly monitor everything students write on school accounts and devices. Thousands of school districts across the country use software like Gaggle and Lightspeed Alert to track kids' online activities, looking for signs they might hurt themselves or others. With the help of artificial intelligence, the technology can dip into online conversations and immediately notify both school officials and law enforcement.

Educators say the technology has saved lives. But critics warn it can criminalize children for careless words.

"It has routinized law enforcement access and presence in students' lives, including in their home," said Elizabeth Laird, a director at the Center for Democracy and Technology.

Schools ratchet up vigilance for threats

In a country weary of school shootings, several states have taken a harder line on threats to schools. Among them is Tennessee, which passed a 2023 zero-tolerance law requiring any threat of mass violence against a school to be reported immediately to law enforcement.

The 13-year-old girl arrested in August 2023 had been texting with friends on a chat function tied to her school email at Fairview Middle School, which uses Gaggle to monitor students' accounts. (The Associated Press is withholding the girl's name to protect her privacy. The school district did not respond to a request for comment.)

Taken to jail, the teen was interrogated and strip-searched, and her parents weren't allowed to talk to her until the next day, according to a lawsuit they filed against the school system. She didn't know why her parents weren't there.

"She told me afterwards, 'I thought you hated me.' That kind of haunts you," said Mathis, the girl's mother.

A court ordered eight weeks of house arrest, a psychological evaluation and 20 days at an alternative school for the girl.

Gaggle's CEO, Jeff Patterson, said in an interview that the school system did not use Gaggle the way it is intended. The purpose is to find early warning signs and intervene before problems escalate to law enforcement, he said.

"I wish that was treated as a teachable moment, not a law enforcement moment," said Patterson.

Private student chats face unexpected scrutiny

Students who think they are chatting privately among friends often do not realize they are under constant surveillance, said Shahar Pasch, an education lawyer in Florida. One teenage girl she represented made a joke about school shootings on a private Snapchat story. Snapchat's automated detection software picked up the comment, the company alerted the FBI, and the girl was arrested on school grounds within hours.

Alexa Manganiotis, 16, said she was startled by how quickly monitoring software works. West Palm Beach's Dreyfoos School of the Arts, which she attends, last year piloted Lightspeed Alert, a surveillance program. Interviewing a teacher for her school newspaper, Alexa discovered two students once typed something threatening about that teacher on a school computer, then deleted it. Lightspeed picked it up, and "they were taken away like five minutes later," Alexa said.

Teenagers face steeper consequences than adults for what they write online, Alexa said. "If an adult makes a super racist joke that's threatening on their computer, they can delete it, and they wouldn't be arrested," she said.

Amy Bennett, chief of staff for Lightspeed Systems, said the software helps understaffed schools "be proactive rather than punitive" by identifying early warning signs of bullying, self-harm, violence or abuse.

The technology can also involve law enforcement in responses to mental health crises. In Florida's Polk County Schools, a district of more than 100,000 students, the school safety program received nearly 500 Gaggle alerts over four years, officers said in public Board of Education meetings. This led to 72 involuntary hospitalization cases under the Baker Act, a state law that allows authorities to require mental health evaluations for people against their will if they pose a risk to themselves or others.

"A really high number of children who experience involuntary examination remember it as a really traumatic and damaging experience -- not something that helps them with their mental health care," said Sam Boyd, an attorney with the Southern Poverty Law Center.

The Polk and West Palm Beach school districts did not provide comments.

An analysis shows a high rate of false alarms

Information that could allow schools to assess the software's effectiveness, such as the rate of false alerts, is closely held by technology companies and unavailable publicly unless schools track the data themselves.

Gaggle flagged more than 1,200 incidents to the Lawrence, Kansas, school district in a recent 10-month period. But almost two-thirds of those alerts were deemed by school officials to be non-issues -- including over 200 false alarms from student homework, according to an Associated Press analysis of data received via a public records request.

Students in one photography class were called to the principal's office over concerns Gaggle had detected nudity. The photos had been automatically deleted from the students' Google Drives, but students who had backups of the flagged images on their own devices showed it was a false alarm. District officials said they later adjusted the software's settings to reduce false alerts.

Natasha Torkzaban, who graduated in 2024, said she was flagged for editing a friend's college essay because it had the words "mental health."

"I think ideally we wouldn't stick a new and shiny solution of AI on a deep-rooted issue of teenage mental health and the suicide rates in America, but that's where we're at right now," Torkzaban said.

She was among a group of student journalists and artists at Lawrence High School who filed a lawsuit against the school system last week, alleging Gaggle subjected them to unconstitutional surveillance.

School officials have said they take concerns about Gaggle seriously, but also say the technology has detected dozens of imminent threats of suicide or violence.

"Sometimes you have to look at the trade for the greater good," said Board of Education member Anne Costello in a July 2024 board meeting.

Two years after their ordeal, Mathis said her daughter is doing better, although she's still "terrified" of running into one of the school officers who arrested her. One bright spot, she said, was the compassion of the teachers at her daughter's alternative school. They took time every day to let the kids share their feelings and frustrations, without judgment.

"It's like we just want kids to be these little soldiers, and they're not," said Mathis. "They're just humans."

___

This reporting reviewed school board meetings posted on YouTube, courtesy of DistrictView, a dataset created by researchers Tyler Simko, Mirya Holman and Rebecca Johnson.

___

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
[2]
Schools are using AI surveillance to protect students. It also leads to false alarms -- and arrests
[3]
Schools are using AI to spy on students and some are getting arrested for misinterpreted jokes and private conversations
[4]
Schools are using AI surveillance to protect students. It also leads to false alarms -- and arrests
[5]
Schools are using AI to protect students. It also leads to false alarms -- and arrests
Schools across the US are using AI-powered surveillance software to monitor students' online activities, leading to both life-saving interventions and controversial arrests over misinterpreted comments.
In an effort to protect students from potential harm, thousands of school districts across the United States are implementing AI-powered surveillance systems to monitor students' online activities. Software like Gaggle and Lightspeed Alert is used to track everything students write on school accounts and devices, looking for signs of self-harm or threats to others [1][2][3].
While educators claim the technology has saved lives, critics warn that it can lead to the criminalization of children for careless words. A case in point is the arrest of a 13-year-old girl in Tennessee for making an offensive joke in an online chat with classmates [1][2][3][4][5].
The incident occurred when the girl's friends teased her about her tanned complexion, calling her "Mexican." In response to a friend's question about her plans for Thursday, she wrote: "on Thursday we kill all the Mexico's." This comment triggered the school's surveillance software, leading to her arrest, interrogation, and strip-search before she spent a night in a jail cell [1][2][3][4][5].
The case highlights the potential for AI surveillance to misinterpret context and escalate situations unnecessarily. Lesley Mathis, the girl's mother, expressed shock at the severity of the response, stating, "It made me feel like, is this the America we live in? And it was this stupid, stupid technology that is just going through picking up random words and not looking at context" [1][2][3][4][5].
The arrest came against the backdrop of Tennessee's 2023 zero-tolerance law, which requires any threat of mass violence against a school to be reported immediately to law enforcement. This policy, combined with AI surveillance, has led to increased involvement of law enforcement in school disciplinary matters [1][2][3][4].
Critics argue that the widespread use of surveillance technology has "routinized law enforcement access and presence in students' lives, including in their home," according to Elizabeth Laird, a director at the Center for Democracy and Technology [1][2][3][4][5].
The technology's reach extends beyond school grounds, with one case involving a Florida teenager who was arrested for a joke made on a private Snapchat story. The incident demonstrates how surveillance can blur the lines between private and public communication [1][2][3][4][5].
While proponents argue that the technology helps identify early warning signs of bullying, self-harm, and violence, there are concerns about the high rate of false alarms. In the Lawrence, Kansas, school district, Gaggle flagged more than 1,200 incidents in a recent 10-month period, and school officials deemed almost two-thirds of them non-issues [1][2][3][4][5].
The use of AI surveillance has also led to increased involvement of law enforcement in mental health crises. In Florida's Polk County Schools, nearly 500 Gaggle alerts over four years resulted in 72 involuntary hospitalization cases under the Baker Act [1][2][3][4][5].
As schools grapple with the challenge of keeping students safe while respecting their privacy and rights, the debate over AI surveillance in education continues. The case of the Tennessee teenager serves as a stark reminder of the potential consequences of relying too heavily on technology to monitor and manage student behavior.