4 Sources
[1]
Florida schools plan to vastly expand use of AI that mistook clarinet for gun
A Florida middle school was locked down last week after an AI security system called ZeroEyes mistook a clarinet for a gun, reviving criticism that AI may not be worth the high price schools pay for peace of mind.

Human review of the AI-generated false flag did not stop police from rushing to Lawton Chiles Middle School. Cops expected to find "a man in the building, dressed in camouflage with a 'suspected weapon pointed down the hallway, being held in the position of a shouldered rifle,'" according to a Washington Post review of the police report. Instead, after finding no evidence of a shooter, cops double-checked with dispatchers, who confirmed that a closer look at the images indicated that "the suspected rifle might have been a band instrument."

Among panicked students hiding in the band room, police eventually found the suspect, a student "dressed as a military character from the Christmas movie Red One for the school's Christmas-themed dress-up day," the Post reported.

ZeroEyes cofounder Sam Alaimo told the Post that the AI performed exactly as it should have in this case, adopting a "better safe than sorry" outlook. A ZeroEyes spokesperson told Ars that "school resource officers, security directors and superintendents consistently ask us to be proactive and forward them an alert if there is any fraction of a doubt that the threat might be real."

"We don't think we made an error, nor does the school," Alaimo said. "That was better to dispatch [police] than not dispatch."

Cops left after the confused student confirmed he was "unaware" that the way he was holding his clarinet could have triggered the alert, the Post reported. But ZeroEyes' spokesperson claimed he was "intentionally holding the instrument in the position of a shouldered rifle." And rather than probe why the images weren't reviewed carefully enough to prevent a false alarm on campus, the school appeared to agree with ZeroEyes and blame the student.

"We did not make an error, and the school was pleased with the detection and their response," ZeroEyes' spokesperson said.

School warns students not to trigger AI

In a letter to parents, principal Melissa Laudani reportedly wrote that "while there was no threat to campus, I'd like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus." Along similar lines, Seminole County Public Schools (SCPS) communications officer Katherine Crnkovich emphasized in an email to Ars to "please make sure it is noted that this student wasn't simply carrying a clarinet. This individual was holding it as if it were a weapon."

However, warning students against brandishing ordinary objects as if they were weapons isn't a perfect solution. Video footage from a Texas high school in 2023 showed that ZeroEyes can sometimes confuse shadows for guns, accidentally flagging a student simply walking into school as a potential threat. The advice also ignores that ZeroEyes last year reportedly triggered a lockdown and police response after detecting two theater kids using prop guns to rehearse a play. And a similar AI tool called Omnilert made national headlines after confusing an empty Doritos bag with a gun, which led to a 14-year-old Baltimore sophomore's arrest. In that case, the student told the American Civil Liberties Union that he was just holding the chips when AI sent "like eight cop cars" to detain him.
For years, school safety experts have warned that AI tools like ZeroEyes take up substantial resources even though they are "unproven," the Post reported. ZeroEyes' spokesperson told Ars that "in most cases, ZeroEyes customers will never receive a 'false positive,'" but the company is not transparent about how many false positives it generates or how many guns it has detected. An FAQ only notes that "we are always looking to minimize false positives and are constantly improving our learning models based on data collected."

In March, as some students began questioning ZeroEyes after it flagged a Nerf gun at a Pennsylvania university, a nearby K-12 private school, Germantown Academy, confirmed that its "system often makes 'non-lethal' detections."

One critic, school safety consultant Kenneth Trump, suggested in October that these tools are "security theater," with firms like ZeroEyes lobbying for taxpayer dollars by relying on what the ACLU called "misleading" marketing to convince schools that the tools are proactive solutions to school shootings. Seemingly in response to this backlash, StateScoop reported that days after it began probing ZeroEyes in 2024, the company scrubbed a claim from its FAQ that said ZeroEyes "can prevent active shooter and mass shooting incidents."

At Lawton Chiles Middle School, "the children were never in any danger," police confirmed, but experts question whether false positives cause students undue stress and suspicion, perhaps doing more harm than good in the absence of efficacy studies. Schools may be better off dedicating resources to mental health services proven to benefit kids, some critics have suggested.

Laudani's letter encouraged parents to submit any questions they have about the incident, but it's hard to gauge whether anyone is upset. Asked if parents were concerned or if ZeroEyes has ever triggered a lockdown at other SCPS schools, Crnkovich told Ars that SCPS does not "provide details regarding the specific school safety systems we utilize."

It's clear, however, that SCPS hopes to expand its use of ZeroEyes. In November, Florida state Senator Keith Truenow submitted a request to install "significantly more cameras" -- about 850 -- equipped with ZeroEyes across the school district. Truenow backed up his request for $500,000 in funding over the next year by claiming that "the more [ZeroEyes] coverage there is, the more protected students will be from potential gun violence."

AI false alarms pose dangers to students

ZeroEyes is among the most popular tools attracting heavy investments from schools in 48 states, which hope that AI gun detection will help prevent school shootings. The AI technology is embedded in security cameras, trained on images of people holding guns, and can supposedly "detect as little as an eighth of an inch of a gun," an ABC affiliate in New York reported. Human reviewers monitor these systems continually, vet the AI's flags, and text any concerning images to school superintendents. Police are alerted when human review determines that images may constitute actual threats.

ZeroEyes' spokesperson told Ars that "it has detected more than 1,000 weapons in the last three years." Perhaps most notably, ZeroEyes "detected a minor armed with an AK-47 rifle on an elementary school campus in Texas," where no shots were fired, StateScoop reported last year. Schools invest tens or, as the SCPS case shows, even hundreds of thousands of dollars annually, with the exact amount depending on the number of cameras they want to equip and other variables affecting pricing.
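That flag-then-verify workflow (a camera feed comes in, a model scores each frame for the presence of a gun, a human reviewer vets any flag, and only then are school officials and police alerted) is the general pattern these products describe. The sketch below is a minimal illustration of that pattern only; the class names, threshold, and notification step are hypothetical and do not represent ZeroEyes' actual software.

```python
# Minimal sketch of a generic flag-then-verify gun-detection pipeline.
# Illustrative only: names, threshold, and notification details are hypothetical,
# not ZeroEyes' actual implementation.
from dataclasses import dataclass


@dataclass
class Detection:
    camera_id: str
    confidence: float  # model's confidence that a weapon appears in the frame
    image_path: str    # saved frame forwarded to the human reviewer


def model_score(frame) -> float:
    """Placeholder for a classifier trained on images of people holding guns."""
    raise NotImplementedError


def human_confirms(detection: Detection) -> bool:
    """Placeholder for the human-in-the-loop review step described above."""
    raise NotImplementedError


def notify_school_and_police(detection: Detection) -> None:
    # In the reported workflow, reviewers text the flagged image to school
    # officials and alert police if they judge the threat may be real.
    print(f"ALERT: possible weapon on camera {detection.camera_id} "
          f"(confidence {detection.confidence:.2f}): {detection.image_path}")


def process_frame(camera_id: str, frame, image_path: str, threshold: float = 0.5) -> None:
    score = model_score(frame)
    if score < threshold:
        return  # below threshold: no flag raised, frame ignored
    detection = Detection(camera_id, score, image_path)
    # Every AI flag passes through a human reviewer before anyone is notified.
    if human_confirms(detection):
        notify_school_and_police(detection)
```

The clarinet incident illustrates where such a pipeline can still fail: a confident but wrong model score, combined with reviewers instructed to err on the side of forwarding alerts, still ends in a police response and a lockdown.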
ZeroEyes estimates that most schools pay $60 per camera monthly, and larger contracts can bring discounts. In Kansas, a statewide initiative equipping 25 cameras at 1,300 schools with ZeroEyes was reportedly estimated to cost $8.5 million annually. Doubling the number of cameras didn't provide much savings, though, with ZeroEyes looking to charge $15.2 million annually to expand coverage.

To critics, it appears that ZeroEyes is attempting to corner the market on AI school security, standing to profit off schools' fears of shootings while showing little proof of the true value of its systems. Last year, ZeroEyes reported that its revenue grew 300 percent from 2023 to 2024, after assisting in "more than ten arrests through its thousands of detections, verifications, and notifications to end users and law enforcement."

Curt Lavarello, the executive director of the School Safety Advocacy Council, told the ABC affiliate that "all of this technology is very, very expensive," considering that "a lot of products ... may not necessarily do what they're being sold to do."

Another problem, according to experts who have responded to some of the country's deadliest school shootings, is that while ZeroEyes' human reviewers can alert police in "seconds," police response can often take "several minutes." That delay could diminish ZeroEyes' impact, one expert suggested, noting that at an Oregon school he responded to, a shooter "shot 25 people in 60 seconds," StateScoop reported.

In Seminole County, where the clarinet incident happened, ZeroEyes has been used since 2021, but SCPS would not confirm whether any guns have ever been detected to justify next year's desired expansion. It's possible that SCPS has this information, as Sen. Truenow noted in his funding request that ZeroEyes can share reports with schools "to measure the effectiveness of the ZeroEyes deployment" by reporting on "how many guns were detected and alerted on campus."

ZeroEyes' spokesperson told Ars that "trained former law enforcement and military make split-second, life-or-death decisions about whether the threat is real," which is supposed to help reduce false positives that could become more common as SCPS adds ZeroEyes to many more cameras.

Amanda Klinger, the director of operations at the Educator's School Safety Network, told the Post that too many false alarms could carry two risks. First, more students could be put in dangerous situations when police descend on schools expecting to confront an active shooter. Second, cops may become fatigued by false alarms and fail to respond with urgency over time. For students, being labeled a suspect by AI can also be invasive and humiliating, reports noted.

"We have to be really clear-eyed about what are the limitations of these technologies," Klinger said.
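As a rough back-of-the-envelope check on those figures, assuming the quoted $60-per-camera monthly list rate, a 12-month year, and a flat rate across all 850 cameras in the Seminole County request (assumptions the reporting does not spell out):

```python
# Back-of-the-envelope check of the pricing figures quoted above.
# Assumes a flat per-camera monthly rate; real contracts and discounts vary.
list_rate = 60               # dollars per camera per month (ZeroEyes' estimate)
cameras = 850                # cameras in the Seminole County funding request
requested_budget = 500_000   # dollars per year in Sen. Truenow's request

list_price_annual = list_rate * cameras * 12               # $612,000 at list price
implied_monthly_rate = requested_budget / (cameras * 12)   # about $49/camera/month

print(f"List price for 850 cameras: ${list_price_annual:,}/year")
print(f"Rate implied by the $500,000 request: ${implied_monthly_rate:,.0f}/camera/month")
```

On those assumptions, the request works out to roughly $49 per camera per month, somewhat below the quoted $60 list rate, which would be consistent with the volume discounts described for larger contracts.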
[2]
A school locked down after AI flagged a gun. It was a clarinet.
A false alarm was raised at a Seminole County middle school in Florida after an AI surveillance system flagged a clarinet as a gun. (Paul Hennessy/Associated Press)

Police responded to the Florida middle school minutes after the alert arrived last week: Security cameras had detected a man in the building, dressed in camouflage with a "suspected weapon pointed down the hallway, being held in the position of a shouldered rifle." The Oviedo school went into lockdown. An officer searched classrooms but couldn't find the person or hear any commotion, according to a police report.

Then dispatchers added another detail. Upon closer review of the image flagged to police, they told the officer, the suspected rifle might have been a band instrument. The officer went to where students were hiding in the band room. He found the culprit -- a student wearing a military costume for a themed dress-up day -- and the "suspected weapon": a clarinet.

The gaffe occurred because an artificial-intelligence-powered surveillance system used by Lawton Chiles Middle School mistakenly flagged the clarinet as a weapon, according to ZeroEyes, the security company that runs the system and contracts with Lawton Chiles's school district. Like a growing number of school districts across the country, Seminole County Public Schools has turned to AI-powered surveillance to bolster campus security. ZeroEyes sells a threat-detection system that scans video surveillance footage for signs of weapons or contraband and alerts law enforcement when they are spotted. The appetite for such systems has grown in an era of frequent, high-profile school shootings -- such as the attack at Brown University on Saturday that killed two students and injured nine.

Some school safety and privacy experts said the recent incident at the Florida middle school is part of a trend in which threat-detection systems used by schools misfire, putting students under undue suspicion and stress. "These are unproven technologies that are marketed as providing a lot of certainty and security," said David Riedman, founder of the K-12 School Shooting Database.

ZeroEyes said that trained employees review alerts before they are sent and that its software can make a lifesaving difference in averting mass shootings by alerting law enforcement to weapons on campus within seconds. At Lawton Chiles, the student flagged by ZeroEyes was holding his musical instrument like a rifle, co-founder Sam Alaimo told The Washington Post. "We don't think we made an error, nor does the school," Alaimo said. "That was better to dispatch [police] than not dispatch."

Seminole County Public Schools declined to comment on Tuesday's incident, but it provided a copy of the letter it sent to parents of Lawton Chiles students after the incident. "While there was no threat to campus, I'd like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus," principal Melissa Laudani wrote.

Concerns about student safety have pushed school districts across the country to embrace a growing industry of AI-assisted security tools that proactively flag threats to administrators and law enforcement. ZeroEyes spokesperson Olga Shmuklyer said its product is used in 48 states and that it has detected more than 1,000 weapons in the last three years. The systems are usually trained to detect a safety risk by reviewing volumes of sample data, such as images of people holding guns, to look for matches in real time. They have sometimes made mistakes.
In October, parents and officials in Baltimore County, Maryland, called for a review of a different AI threat-detection system after it confused a bag of Doritos chips for a gun and sent an alert that led to a high schooler being handcuffed. In 2023, a high school in Clute, Texas, went into lockdown after ZeroEyes falsely alerted that a person was carrying a rifle, according to News 4 San Antonio.

In one case, a different threat-detection system failed to avert a fatal school shooting. Antioch High School in Nashville was equipped with AI surveillance software to detect guns in January when a 17-year-old student killed a classmate in a shooting, according to CNN. The system missed the shooter because he was too far away from surveillance cameras to detect his weapon, CNN reported.

Other systems that monitor students' activity on school devices have also been criticized for falsely accusing students and violating their privacy. In September, students at a Kansas high school sued their school district after a monitoring tool falsely flagged art projects as pornography.

ZeroEyes has worked closely with Seminole County Public Schools since 2021, according to news reports and the company. That year, it held a live demonstration of the ZeroEyes system's ability to detect guns at Oviedo High School. "We've been very very pleased with the technology," Seminole County Schools Public Safety Director Richard Francis told Fox 35 News at the time.

Alaimo, the ZeroEyes co-founder, said the company hires employees with military or law enforcement experience who are "calm under pressure and ... very good at identifying guns" to review potential threats flagged by AI. The image ZeroEyes flagged at Lawton Chiles showed the student appearing to aim his clarinet like a gun at a door and strongly resembled "a shooter about to do something bad," Alaimo said.

The officer who responded to the alert questioned the student with the clarinet, according to the police report. The student said he was dressed as a military character from the Christmas movie "Red One" for the school's Christmas-themed dress-up day. The student said he was "unaware" he was holding his clarinet in a way that would have triggered an alert, according to the report. Police took no further action.

Chad Marlow, a senior policy counsel at the American Civil Liberties Union who has studied school security systems, said incidents like the one at Lawton Chiles show that systems like ZeroEyes can still be fallible, even with humans reviewing the threats that AI flags. "If a computer technology is telling a ... human evaluator that they see a gun and that literally seconds may be critical, that person is going to err on the side of saying it's a weapon," he said.

Amanda Klinger, the director of operations at the Educator's School Safety Network, added that false reports risk "alarm fatigue" and dangerous situations if armed police respond to a school looking for a shooter. "We have to be really clear-eyed about what are the limitations of these technologies," Klinger said.

Alaimo said ZeroEyes -- and its partners at school districts -- would rather be safe than sorry. "A superintendent, a school resource officer, a chief of police, a director of security, they're going to say, 'Yes, be more proactive, be more inclined to give me the alert if you have a fraction of a doubt,'" Alaimo said. "Because they want to keep people safe."
[3]
AI Sends School Into Lockdown After It Mistook a Student's Clarinet for a Gun
"A student was walking in the hallway, holding a musical instrument as if it were a weapon, which triggered the Code Red to activate." Students at Lawton Chiles Middle School in Florida were sent scrambling into lockdown last week, after an alert from an AI surveillance system detected a student carrying a gun. There's just one issue: the "gun" was actually a band student's clarinet. The whole thing unraveled quickly, according to local reporting. When the alert went out, it triggered an automatic "code red," giving administrators no choice but to react to the AI system's decision. Luckily nobody was hurt, and local police soon declared the lockdown over. "The code red was a precaution and the children were never in any danger," local police wrote in a Facebook post. In a message to parents, school Principal Dr. Melissa Laudani said the district has "multiple layers of school safety, including an automated system that detects potential threats. A student was walking in the hallway, holding a musical instrument as if it were a weapon, which triggered the code red to activate." (It's not known what exactly constitutes a "code red," as it isn't mentioned in the school's latest Parent Student handbook.) Rather than blame the faulty AI system for the commotion -- without which the fiasco never would have happened in the first place -- the school blamed the young clarinetist. "While there was no threat to campus, I'd like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus," Laudani wrote. It's not known what particular system Lawton Chiles has on overwatch, but the fact that it can't differentiate between a clarinet with 17 keys and a rifle with none is concerning. The Lawton Chiles incident comes soon after another similar case, in which AI led to the violent detention of a 16-year old teen in Baltimore by at least eight officers with guns drawn. In that case, the school's AI had somehow misidentified a small bag of Doritos for a handgun, prompting a heavily armed response from city police. Like the story from Lawton Chiles, the Baltimore false positive could've easily been prevented had a human being been in the loop. Instead, it seems both systems allowed the AI to make the call without any challenge from a human being. Omniilert, the company behind the Dorito incident, insisted that its AI system "functioned as intended: to prioritize safety and awareness through rapid human verification." It all underscores just how important it is that these AI school surveillance systems actually work before they're forced into schools -- because when it comes to identifying students with guns, false positives can be just as deadly as a false negatives.
[4]
A school locked down after AI flagged a gun. It was a clarinet.
Police responded to the Florida middle school minutes after the alert arrived last week: Security cameras had detected a man in the building, dressed in camouflage with a "suspected weapon pointed down the hallway, being held in the position of a shouldered rifle." The Oviedo school went into lockdown. An officer searched classrooms but couldn't find the person or hear any commotion, according to a police report.

Then dispatchers added another detail. Upon closer review of the image flagged to police, they told the officer, the suspected rifle might have been a band instrument. The officer went to where students were hiding in the band room. He found the culprit -- a student wearing a military costume for a themed dress-up day -- and the "suspected weapon": a clarinet.

The gaffe occurred because an artificial-intelligence-powered surveillance system used by Lawton Chiles Middle School mistakenly flagged the clarinet as a weapon, according to ZeroEyes, the security company that runs the system and contracts with Lawton Chiles's school district. Like a growing number of school districts across the country, Seminole County Public Schools has turned to AI-powered surveillance to bolster campus security. ZeroEyes sells a threat-detection system that scans video surveillance footage for signs of weapons or contraband and alerts law enforcement when they are spotted. The appetite for such systems has grown in an era of frequent, high-profile school shootings -- such as the attack at Brown University on Saturday that killed two students and injured nine.

Some school safety and privacy experts said the recent incident at the Florida middle school is part of a trend in which threat-detection systems used by schools misfire, putting students under undue suspicion and stress. "These are unproven technologies that are marketed as providing a lot of certainty and security," said David Riedman, founder of the K-12 School Shooting Database. Riedman was employed by ZeroEyes as a director of industry research in September 2023, and his employment ended in termination that year, according to ZeroEyes and Riedman.

ZeroEyes said that trained employees review alerts before they are sent and that its software can make a lifesaving difference in averting mass shootings by alerting law enforcement to weapons on campus within seconds. At Lawton Chiles, the student flagged by ZeroEyes was holding his musical instrument like a rifle, co-founder Sam Alaimo told The Washington Post. "We don't think we made an error, nor does the school," Alaimo said. "That was better to dispatch [police] than not dispatch."

Seminole County Public Schools declined to comment on Tuesday's incident, but it provided a copy of the letter it sent to parents of Lawton Chiles students after the incident. "While there was no threat to campus, I'd like to ask you to speak with your student about the dangers of pretending to have a weapon on a school campus," principal Melissa Laudani wrote.

Concerns about student safety have pushed school districts across the country to embrace a growing industry of AI-assisted security tools that proactively flag threats to administrators and law enforcement. ZeroEyes spokesperson Olga Shmuklyer said its product is used in 48 states and that it has detected more than 1,000 weapons in the last three years. The systems are usually trained to detect a safety risk by reviewing volumes of sample data, such as images of people holding guns, to look for matches in real time. They have sometimes made mistakes.
In October, parents and officials in Baltimore County, Maryland, called for a review of a different AI threat-detection system after it confused a bag of Doritos chips for a gun and sent an alert that led to a high schooler being handcuffed. In 2023, a high school in Clute, Texas, went into lockdown after ZeroEyes falsely alerted that a person was carrying a rifle, according to News 4 San Antonio.

In one case, a different threat-detection system failed to avert a fatal school shooting. Antioch High School in Nashville was equipped with AI surveillance software to detect guns in January when a 17-year-old student killed a classmate in a shooting, according to CNN. The system missed the shooter because he was too far away from surveillance cameras to detect his weapon, CNN reported.

Other systems that monitor students' activity on school devices have also been criticized for falsely accusing students and violating their privacy. In September, students at a Kansas high school sued their school district after a monitoring tool falsely flagged art projects as pornography.

ZeroEyes has worked closely with Seminole County Public Schools since 2021, according to news reports and the company. That year, it held a live demonstration of the ZeroEyes system's ability to detect guns at Oviedo High School. "We've been very very pleased with the technology," Seminole County Schools Public Safety Director Richard Francis told Fox 35 News at the time.

Alaimo, the ZeroEyes co-founder, said the company hires employees with military or law enforcement experience who are "calm under pressure and ... very good at identifying guns" to review potential threats flagged by AI. The image ZeroEyes flagged at Lawton Chiles showed the student appearing to aim his clarinet like a gun at a door and strongly resembled "a shooter about to do something bad," Alaimo said.

The officer who responded to the alert questioned the student with the clarinet, according to the police report. The student said he was dressed as a military character from the Christmas movie "Red One" for the school's Christmas-themed dress-up day. The student said he was "unaware" he was holding his clarinet in a way that would have triggered an alert, according to the report. Police took no further action.

Chad Marlow, a senior policy counsel at the American Civil Liberties Union who has studied school security systems, said incidents like the one at Lawton Chiles show that systems like ZeroEyes can still be fallible, even with humans reviewing the threats that AI flags. "If a computer technology is telling a ... human evaluator that they see a gun and that literally seconds may be critical, that person is going to err on the side of saying it's a weapon," he said.

Amanda Klinger, the director of operations at the Educator's School Safety Network, added that false reports risk "alarm fatigue" and dangerous situations if armed police respond to a school looking for a shooter. "We have to be really clear-eyed about what are the limitations of these technologies," Klinger said.

Alaimo said ZeroEyes -- and its partners at school districts -- would rather be safe than sorry. "A superintendent, a school resource officer, a chief of police, a director of security, they're going to say, 'Yes, be more proactive, be more inclined to give me the alert if you have a fraction of a doubt,'" Alaimo said. "Because they want to keep people safe."
An AI-powered surveillance system at Lawton Chiles Middle School in Florida flagged a student's clarinet as a weapon, triggering a full police response. The student was dressed as a military character for a themed dress-up day. Despite human review, officers rushed to the scene expecting an armed threat. The incident highlights growing concerns about AI threat-detection systems in schools, with critics calling them unproven technologies that cause undue stress while companies defend their better-safe-than-sorry approach.
Lawton Chiles Middle School in Seminole County, Florida, went into full lockdown last week after an AI-powered surveillance system misidentified a clarinet as a gun [1]. Police responded to reports of a man in camouflage with a "suspected weapon pointed down the hallway, being held in the position of a shouldered rifle," according to the police report [2]. The AI security system, operated by ZeroEyes, had flagged the image and sent it through human review before law enforcement was dispatched to the scene.
Officers searched classrooms but found no evidence of a shooter or commotion. Dispatchers later clarified that, upon closer examination, the suspected rifle might have been a band instrument [4]. Police eventually located the "suspect" in the band room: a student dressed as a military character from the Christmas movie Red One for the school's themed dress-up day, holding a clarinet [1].

ZeroEyes cofounder Sam Alaimo told The Washington Post that the AI threat-detection system performed exactly as intended, adopting a "better safe than sorry" approach [2]. A ZeroEyes spokesperson emphasized that "school resource officers, security directors and superintendents consistently ask us to be proactive and forward them an alert if there is any fraction of a doubt that the threat might be real" [1].

The company insists the student was "intentionally holding the instrument in the position of a shouldered rifle," though the confused student told police he was "unaware" his clarinet-holding position could trigger an alert [1]. Rather than acknowledge any system limitations, both ZeroEyes and the school placed responsibility on the student. Principal Melissa Laudani wrote to parents asking them to "speak with your student about the dangers of pretending to have a weapon on a school campus" [2].

This incident is far from isolated. In October, a different AI threat-detection system in Baltimore County, Maryland, confused a bag of Doritos for a handgun, leading to a 14-year-old being handcuffed by at least eight officers [1]. The student told the American Civil Liberties Union he was simply holding chips when AI sent "like eight cop cars" to detain him [1].

ZeroEyes itself has generated multiple false positives. In 2023, a Texas high school went into lockdown after the system confused shadows for guns, flagging a student simply walking into school [1]. The company also reportedly triggered a lockdown and police response after detecting theater students using prop guns during play rehearsal [1]. Even a nearby Pennsylvania private school, Germantown Academy, confirmed its ZeroEyes system "often makes 'non-lethal' detections" [1].
David Riedman, founder of the K-12 School Shooting Database, described these tools as "unproven technologies that are marketed as providing a lot of certainty and security" [2]. School safety consultant Kenneth Trump went further in October, calling these systems "security theater" and suggesting that firms like ZeroEyes rely on misleading marketing to secure taxpayer dollars [1].

ZeroEyes lacks transparency about its effectiveness. The company told Ars Technica that "in most cases, ZeroEyes customers will never receive a 'false positive,'" but provides no data on how many false alarms occur or how many actual weapons have been detected [1]. A spokesperson said the product is used in 48 states and has detected more than 1,000 weapons in the last three years [2]. Notably, after StateScoop began investigating ZeroEyes in 2024, the company scrubbed a claim from its FAQ stating it "can prevent active shooter and mass shooting incidents" [1].

Experts question whether false positives cause students undue stress and suspicion, potentially doing more harm than good in the absence of efficacy studies. The clarinet incident at Lawton Chiles left panicked students hiding in the band room, though police confirmed "the children were never in any danger" [1].

The effectiveness of AI-powered surveillance systems remains questionable even in actual emergencies. Antioch High School in Nashville was equipped with AI surveillance software for gun detection in January when a 17-year-old student killed a classmate in a shooting. The system missed the shooter because he was too far from surveillance cameras to detect his weapon [4].

Despite the controversy, Seminole County Public Schools has worked closely with ZeroEyes since 2021 and appears satisfied with the technology. Public Safety Director Richard Francis told Fox 35 News the district has "been very very pleased with the technology" [2]. The incident raises urgent questions about whether schools are better off dedicating resources to alternative school safety measures rather than AI systems that generate false alarms while lacking proven track records in preventing gun violence.

Summarized by Navi