13 Sources
[1]
High school's AI security system confuses Doritos bag for a possible firearm | TechCrunch
A high school student in Baltimore County, Maryland was reportedly handcuffed and searched after an AI security system flagged his bag of chips as a possible firearm. Taki Allen, a student at Kenwood High School, told CNN affiliate WBAL, "I was just holding a Doritos bag -- it was two hands and one finger out, and they said it looked like a gun." But as a result, Allen said, "They made me get on my knees, put my hands behind my back, and cuffed me." In a statement shared with parents, Principal Katie Smith said the school's security department had reviewed and canceled a gun detection alert, while Smith (who didn't immediately realize the alert had been canceled) reported the situation to the school resource officer, who called the local police. Omnilert, the company that operates the AI gun detection system, told CNN, "We regret that this incident occurred and wish to convey our concern to the student and the wider community affected by the events that followed." Nonetheless, Omnilert said "the process functioned as intended."
[2]
Cops alerted by AI gun detection system arrest high school student holding bag of Doritos -- eight cars sent to disarm chip-toting teen
After Omnilert AI detects a gun on campus, there is supposed to be a human verification step before the cops are called. A young student was left traumatized after being ordered to the ground and handcuffed by police because an AI gun detection system erroneously called the cops on his Doritos habit. Taki Allen ate the bag of chips while waiting to be picked up from Kenwood High School in Baltimore County last Monday night (Oct 20), reports WBAL-TV 11 News. Football practice was over, and the student was sitting with friends outside the school. However, his crunchy repast triggered the school's camera-based Omnilert AI system. Twenty minutes after he began chomping on the savory corn-based treat, eight police cars arrived in response to Allen's snack habit. He was quickly ordered to his knees by armed police, and his hands were cuffed behind his back. "It was a scary situation," Allen explained to WBAL-TV. It didn't take long before the AI error became apparent to all involved, and police were pleasingly transparent about the snafu. According to the student's interview with local news, he was shown an image explaining the sizable police response. However, the picture puzzled him. "I was just holding a Doritos bag -- it was two hands and one finger out, and they said it looked like a gun," Allen said to WBAL-TV 11 News. Sadly, the TV news cameras didn't turn their attention to Allen's explanatory gesturing when he seemingly demonstrated the pose that got him cuffed. We also haven't seen a copy of the AI-triggering scene from the night. So, we are left with a cautionary tale without clear guidance about how to safely cradle a bag of Doritos. Omnilert was the AI gun detection software company behind this high-profile firearms hallucination. The firm declined to comment to WBAL-TV 11 News on the emergency response callout error, saying only, per the station's paraphrase, that it does not comment on internal school procedures.
We checked out Omnilert's School Security Systems product pages for more information about how its product works. One of the big attractions of Omnilert is that it can piggyback on arrays of security cameras that are already installed. Its publicity material name-checks school shooting tragedies like Uvalde, Sandy Hook, and Parkland, implying that Omnilert might have prevented them. Omnilert's active shooter and gun detection system is purportedly a three-step process: a positive AI gun detection, followed by a human verification step, and then automated notification and emergency response. However, it is hard to pin the error on any one of those steps if the police also judged the Doritos-in-hand photo 'gun-like' enough to send a response team of eight cars. The WBAL-TV report suggests that the officers had a copy of the AI-triggering scene with them to show the astonished Allen, but we aren't entirely clear about the reveal timeline. Omnilert and the school have offered to provide counseling to the students involved in the incident.
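Omnilert's published materials describe this detect-verify-notify flow only at a high level. As a rough illustration, the claimed three-step process can be sketched as a simple pipeline; every name, threshold, and signature below is hypothetical and does not reflect Omnilert's actual software:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical record of a frame the model flagged."""
    camera_id: str
    confidence: float
    frame_ref: str  # reference to the frame that triggered the alert

def ai_gun_detection(frame_ref: str, camera_id: str, score: float):
    """Step 1: the model flags frames whose gun-likeness score crosses a threshold."""
    THRESHOLD = 0.8  # illustrative value; real thresholds are not public
    if score >= THRESHOLD:
        return Detection(camera_id, score, frame_ref)
    return None

def human_verification(detection: Detection, reviewer_confirms: bool) -> bool:
    """Step 2: a human reviewer confirms or cancels the alert before escalation."""
    return reviewer_confirms

def notify_emergency_services(detection: Detection) -> str:
    """Step 3: automated notification to school safety staff and law enforcement."""
    return f"ALERT: possible weapon on camera {detection.camera_id}"

def run_pipeline(frame_ref: str, camera_id: str, score: float,
                 reviewer_confirms: bool) -> str:
    detection = ai_gun_detection(frame_ref, camera_id, score)
    if detection is None:
        return "no alert"
    if not human_verification(detection, reviewer_confirms):
        return "alert cancelled"
    return notify_emergency_services(detection)
```

In the Kenwood incident, the reported breakdown was between steps 2 and 3: the school's safety team cancelled the alert, but the principal, unaware of the cancellation, still escalated to the school resource officer.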
[3]
High school student surrounded by police and handcuffed after AI mistakes his bag of Doritos for a gun
Even as billions are poured into AI and its use becomes increasingly widespread, we're still seeing very worrying instances of the technology getting things wrong. A prime example happened at a high school when a student was swarmed by police after an AI thought the Doritos bag he was holding was a gun. Taki Allen, a 16-year-old student at Kenwood High School in Baltimore County, Maryland, got an unwelcome surprise as he sat outside with friends after football practice while waiting to be picked up last week. About 20 minutes after he finished the bag of Doritos he had been eating, around eight police cars showed up. Officers climbed out of the cars and, with their weapons pointed at Allen, told him to put his hands on his head and walk toward them. The student said he was then told to get on his knees and was placed in handcuffs. Baltimore County Police Department later confirmed that Allen was handcuffed but not arrested. Unfortunately for Allen, the artificial intelligence that reviews the school's surveillance camera footage mistook the crumpled Doritos bag for a firearm and alerted the police. "Do you have a gun on you?" one officer asked the boy as the others searched his friends. After reviewing the footage that the AI had flagged as a gun, police looked in a nearby trash can and found the bag of chips. "I guess just the way you were eating chips ... Doritos, whatever ... it picked it up as a gun," one officer said. So, remember: never eat chips the wrong way. Another cop added that "AI is not the best," which is an understatement. School Superintendent Dr. Myriam Rogers told reporters that the system worked how it was meant to.
"The program is based on human verification and in this case the program did what it was supposed to do which was to signal an alert and for humans to take a look to find out if there was cause for concern in that moment," Rogers said. Understandably, Allen didn't share Rogers' point of view. "I don't think no chip bag should be mistaken for a gun at all." School principal Kate Smith sent a letter to parents following the incident emphasizing the importance of student safety, but Allen said Smith never spoke to him until three days after it happened. The mistake has led to calls from local politicians for the school's use of AI-based surveillance technology to be reviewed. Allen said he now waits inside after football practice, as he does not think it is "safe enough to go outside, especially eating a bag of chips or drinking something." Omnilert, the developer of the AI system used by the school, told BBC News, "We regret this incident occurred and wish to convey our concern to the student and the wider community affected by the events that followed." "While the object was later determined not to be a firearm, the process functioned as intended: to prioritise safety and awareness through rapid human verification," it said.
[4]
Armed police surround teen after AI mistakes crisp packet for gun
Mr Allen said he now waits inside after football practice, as he does not think it is "safe enough to go outside, especially eating a bag of chips or drinking something". In a letter to parents, school principal Kate Smith said the school's safety team "quickly reviewed and cancelled the initial alert after confirming there was no weapon". "I contacted our school resource officer (SRO) and reported the matter to him, and he contacted the local precinct for additional support," she said. "Police officers responded to the school, searched the individual and quickly confirmed that they were not in possession of any weapons." However, local politicians have called for further investigation into the incident. "I am calling on Baltimore County Public Schools to review procedures around its AI-powered weapon detection system," Baltimore County councilman Izzy Patoka wrote on Facebook. The BBC has approached Omnilert, the reported provider of the AI tool, for comment. Omnilert says it is a "leading provider" of AI gun detection, citing a number of US schools among its case studies on its website. The company claims its tech uses real, diverse data, leading to "more reliable detection, fewer false positives, and a system that actually works where it matters most". "Real-world gun detection is messy," it states. "Lighting varies, weapons come in all shapes, and environments are full of noise and movement. Our data-centric methodology trains AI to succeed in these exact scenarios -- because we use real data from real conditions, not simulations." But Mr Allen said: "I don't think no chip bag should be mistaken for a gun at all." The ability of AI to accurately identify weapons has been subject to scrutiny. Last year, US weapons-scanning company Evolv Technology was banned from making unsupported claims about its products after claiming its AI scanner, used at the entrances of thousands of US schools, hospitals and stadiums, could detect all weapons.
[5]
Teen Swarmed by Cops After AI Metal Detector Flags His Doritos Bag as a Gun
Further evidence that artificial intelligence is not all that intelligent has been provided by an unfortunate incident in Baltimore, where, thanks to an AI-guided security system, local police nearly arrested a teenager for the crime of...eating Doritos? According to NBC affiliate WBAL-TV 11, a teenager who entered his high school's campus through an automated security system had a crumpled-up bag of chips in his pocket. The system appears to have flagged the bag as a weapon, the outlet writes. The boy in question, Taki Allen, says that, following his football practice, he was sitting outside the school with a group of his friends when a large band of police officers showed up. "It was like eight cop cars that came pulling up for us. At first, I didn't know where they were going until they started walking toward me with guns, talking about, 'Get on the ground,' and I was like, 'What?'" Allen told WBAL-TV 11 News. "They made me get on my knees, put my hands behind my back, and cuffed me," the teen added. "Then, they searched me, and they figured out I had nothing. Then, they went over to where I was standing and found a bag of chips on the floor." When asked what he was thinking about as the ordeal unfolded, Allen replied: "It was mainly like, am I gonna die? Are they going to kill me? They showed me the picture, said that looks like a gun, I said, 'no, it's chips.'" A statement provided by the school's principal to the news outlet sheds more light on the incident: At approximately 7 p.m., school administration received an alert that an individual on school grounds may have been in possession of a weapon. The Department of School Safety and Security quickly reviewed and canceled the initial alert after confirming there was no weapon. I contacted our school resource officer (SRO) and reported the matter to him, and he contacted the local precinct for additional support.
Police officers responded to the school, searched the individual and quickly confirmed that they were not in possession of any weapons. Neither police nor school officials have confirmed the involvement of the Doritos bag, but they haven't denied it either. Gizmodo reached out to Allen's school, Kenwood High School, as well as to Baltimore County police for comment. WBAL-TV 11 says the company behind the detector, Omnilert (which calls itself a "pioneer in AI-powered active shooter prevention technology"), provides security systems for Baltimore County Public Schools. The outlet says that Allen's school began using the company's software last year to detect potential threats to campus. Omnilert's website states that it sells an AI gun detection solution to schools. Gizmodo reached out to Omnilert for comment.
[6]
US student handcuffed after AI system apparently mistook bag of chips for firearm
Baltimore county high schools have a gun detection system that alerts police if it sees what it deems suspicious. An artificial intelligence (AI) system apparently mistook a high school student's bag of Doritos for a firearm and called local police to tell them the pupil was armed. Taki Allen was sitting with friends on Monday night outside Kenwood high school in Baltimore and eating a snack when police officers with guns approached him. "At first, I didn't know where they were going until they started walking toward me with guns, talking about, 'Get on the ground,' and I was like, 'What?'" Allen told the WBAL-TV 11 News television station. Allen said they made him get on his knees, handcuffed and searched him - finding nothing. They then showed him a copy of the picture that had triggered the alert. "I was just holding a Doritos bag - it was two hands and one finger out, and they said it looked like a gun," Allen said. Baltimore county high schools last year began using a gun detection system that uses school cameras and AI to detect potential weapons. If it spots something it believes to be suspicious, it sends an alert to the school and law enforcement. In a letter to school families obtained by WBAL TV 11 News, the school wrote: "We understand how upsetting this was for the individual that was searched as well as the other students who witnessed the incident. Our counselors will provide direct support to the students who were involved in this incident and are also available to speak with any student who may need support." Baltimore county police told the outlet: "Officers assigned to Precinct 11-Essex responded to Kenwood High School following a report of a suspicious person with a weapon. Once on scene, the person was searched and it was determined the subject was not in possession of any weapons." Lamont Davis, Allen's grandfather, told the television station: "Nobody wants this to happen to their child. No one wants this to happen."
[7]
An AI Mistook a Doritos Bag for a Gun and Called the Cops on a Teenager
"If I eat another bag of chips or drink something, I feel like they're going to come again." An AI-powered gun detection system hooked up to a Baltimore County high school's cameras mistook a bag of Doritos chips for a weapon -- and called the cops on a 16-year-old student. As local news station WBAL-TV 11 News reports, Taki Allen was enjoying the snack while sitting outside of Kenwood High School after football practice. Twenty minutes later, he was visited by a small army of heavily armed police officers. "It was like eight cop cars that came pulling up for us," he told WBAL-TV 11 News. "At first, I didn't know where they were going until they started walking toward me with guns, talking about, 'Get on the ground,' and I was like, 'What?'" "They made me get on my knees, put my hands behind my back, and cuffed me," Allen added. "Then, they searched me and they figured out I had nothing." "I was just holding a Doritos bag -- it was two hands and one finger out, and they said it looked like a gun," the student said. The incident highlights glaring shortcomings with current gun detection systems, which are being rolled out at schools across the country. That's not to mention the problematic privacy concerns of monitoring students with flawed AI tech or the outsized role law enforcement plays in public schools. Besides false positives, gun identification software has proven unable to prevent deadly shootings, such as the one at Antioch High School in suburban Nashville earlier this year. Other systems focused on gun detection have previously been accused of furthering racial biases, raising the possibility that Black students, like Allen, could be facing AI-facilitated discrimination while spending time at school. The Baltimore County Public Schools system rolled out Virginia-based startup Omnilert's gun detection tech last year. Once hooked up to public cameras, it can scan surveillance footage and alert police to potential weapons in real time.
Omnilert is only one of a whole host of US-based startups aiming to put an end to gun violence at schools, a demonstrably flawed alternative to gun control regulation. According to the Baltimore Banner, Omnilert's tech analyzes image frames from 7,000 school cameras for suspicious activity. "Because the image closely resembled a gun being held, it was verified and forwarded to the Baltimore County Public Schools safety team within seconds for their assessment and decision-making," Omnilert spokesperson Blake Mitchell told the Baltimore Banner. "Even as we look at it now, with full awareness that it's not a gun, it still looks to most people like one," he conceded. According to FOX45 News, Omnilert later called the latest incident a "false positive" but maintained that it "functioned as intended: to prioritize safety and awareness through rapid human verification." Besides being scared for his life, Allen told FOX45 News that he had never received an apology from the school. "They just told me it was protocol," he said. "I was expecting at least somebody to talk to me about it." It's a horrifying incident, highlighting how flawed tech is needlessly instilling fear in the hearts of innocent students. "I don't feel like going out there anymore," Allen told FOX45. "If I eat another bag of chips or drink something, I feel like they're going to come again." Allen's relatives are understandably calling for more oversight. "There was no threat for eight guns to be pointed at a 16-year-old," his grandfather, Lamont Davis, told the Baltimore Banner.
[8]
Multiple police cars showed up to a school after surveillance system monitored by AI flagged a bag of Doritos as a gun: 'AI is not the best'
"The program did what it was supposed to do," claims the AI software's manufacturer. AI is slowly making its way into every facet of life, and that means more and more fields are being manned by machines that can't be held accountable. AI going wrong is normally confined to a poor Google search or bad Photoshop, but on October 20, police showed up armed at a high school as a result of an AI miscalculation. As reported by WMAR 2 News, multiple police cars showed up to a high school in Baltimore County, Maryland, to apprehend a student whom the AI had flagged as having a gun (bodycam footage from ABC 7 Chicago). All students on the premises at 7:23 PM were checked by the police officers, and no gun was found. It wasn't until three minutes after apprehending the students that the police officers spotted a blue bag of Doritos in the bin and put together what had gone wrong. The Omnilert AI gun detection system, which monitors the school's cameras, alerted the school that a gun may be on the grounds, and the authorities were informed as a result. In an attempt to explain what had happened to the students, an officer says, "I guess just the way you guys were eating chips... Doritos or whatever, it picked it up as a gun." After the problem was resolved and the culprit (a bag of Doritos) was found, one of the police officers told a student, "AI is not the best". A spokesperson for Omnilert told CBS, "Within moments, the event was marked as resolved in our system. Omnilert's involvement concluded at that point, and the system operated as designed -- detecting a possible threat, routing it for human review, and ensuring rapid, informed decision-making." Baltimore County Public Schools superintendent Dr. Myriam Rogers claims, "The program did what it was supposed to do, which was to signal an alert and for humans to take a look to find out if there was cause for concern at that moment."
Effectively, Omnilert is intended to flag potential problems to higher authorities, who are then supposed to manually check suspicious activity and verify real threats. Rogers says, "This system is AI, and it's looking for certain elements, and then humans have to verify them." Omnilert reportedly 'welcomes' any "review or discussion that helps the public better understand how our system works." Izzy Patoka and Julian Jones, two councilmen from Baltimore County, reportedly want to review the Omnilert system in the wake of this incident. Jones says, "Thank god it was not worse", and asks, "How did it come to be that we had police officers with guns drawn approaching a kid because of a bag of Doritos?" Taki Allen, the student identified, told WMAR2 News, "Now, I feel like sometimes after practice I don't go outside anymore. Cause if I go outside, I don't want - don't think I'm safe enough to go outside, especially eating a bag of chips or drinking something. I just stay inside until my ride comes."
[9]
Body cam shows Baltimore County officers stunned after gun scare was just a bag of chips
WJZ has obtained the body cam footage of police confronting a student at Kenwood High School after an AI gun detection system mistakenly detected that a student had a weapon. Baltimore County leaders are now calling for a review of that system. During the footage from Monday's incident, officers are seen approaching the student, searching him, and then stunned when they realize what the AI flagged as a gun was just a bag of Doritos chips. With guns drawn, Baltimore County Police surrounded a group of students after the Omnilert AI Gun Detection System warned school leaders that a student had a gun. Body camera footage shows police detaining all of those students and then searching one of them. The student never had a gun; what Omnilert detected was a bag of chips. After confirming there was no weapon, the Department of School Safety and Security reviewed and canceled the initial alert. "Just so you guys are aware...basically, the cameras around the system, they pick up on things that look like guns...I guess just the way you were eating chips...Doritos, whatever.... it picked it up as a gun," an officer explained. Officials with the school and Omnilert said the system was working properly, but county leaders have called for its review, concerned that the false alarm traumatized the students. "How did it come to be that we had police officers with guns drawn approaching a kid because of a bag of Doritos?" said Julian Jones, Baltimore County Councilman. "...the program did what it was supposed to do, which was signal an alert and for humans to take a look to find out if there was cause for concern in that moment," Myriam Rogers, the superintendent of Baltimore County Public Schools, explained during a press conference. As police conclude their investigation, one of the responding officers is heard pointing out the system's faults on bodycam footage, stating, "AI is not the best."
Kenwood's principal wrote in a letter to the school community that counseling will be provided to the students who were involved in the incident and will be available to any student who may need support.
[10]
AI mistakes a bag of Doritos for a gun, calls the cops
TL;DR: An AI gun detection system at a Baltimore County school mistakenly identified a student holding a bag of Doritos as carrying a firearm, leading to his handcuffing and police intervention. The incident raises concerns about AI accuracy and the potential consequences of false weapon alerts in school security systems. According to a local news report from WBAL-TV 11 News, an "artificial intelligence detector" called the cops on a student at a Baltimore County school after mistaking a bag of Doritos for a gun. Apparently, the student was outside his high school, eating some Doritos and chatting with friends, when the Omnilert AI system notified the police that a gun-carrying individual was on school property. 'Omnilert: AI Gun Detection & Emergency Response Technology' is a proactive security system that automatically notifies law enforcement and emergency services when it detects firearms within a "fraction of a second." According to Omnilert's official website, there's a human verification step with one of its certified operators before a call is placed to the police, so it's unclear how a bag of chips could have been mistaken for a gun. The student in question, Taki Allen, told reporters that "eight cop cars" pulled up with officers pointing their guns at him, ordering him to get on the ground, on the suspicion that he was carrying a firearm. "They made me get on my knees, put my hands behind my back, and cuffed me," Allen said. "Then, they searched me and they figured out I had nothing." According to the report, it does sound like the image of Taki Allen holding a bag of Doritos could have looked like a gun, as apparently the cops showed him the picture where "it was two hands and one finger out, and they said it looked like a gun." The image in question hasn't been released to the public, so we can only assume what it might look like when a bag of Doritos is mistaken for a gun.
Or the steps students need to take when eating a bag of chips to avoid being mistaken for an active shooter.
[11]
Big AI blunder? Cops cuff teen in Baltimore after AI security system detects gun in his bag; here's what happened next
Drama unfolded in Baltimore, a city in the US state of Maryland, after an AI security system apparently spotted a teenager with a gun and alerted the city police. Later, it turned out that it had only flagged a bag of cheesy goodness. The teenager at the center of the incident was identified as Taki Allen, a 16-year-old student. The incident was captured in body cam footage from Kenwood High School in Baltimore County, where Allen was seen getting patted down while handcuffed after an AI-powered gun detection system flagged him, believing he was carrying a weapon, according to TMZ. A search was carried out, and nothing was found. An officer on the body cam tried to break down and explain the entire incident, saying that the AI system fired off the alarm because of "the way you guys were eating chips, Doritos, or whatever; it picked it up as a gun." A Baltimore County Police Department spokesperson told TMZ, "The officers responded appropriately and proportionally based on the information provided at the time. The incident was safely resolved after it was determined there was no threat." The school uses Omnilert. According to TMZ, it is a program that taps into existing school camera feeds to detect objects resembling weapons and then sends alerts to school safety officers and law enforcement. Speaking to the local news station WBAL-TV 11 News, Allen said that he had two hands on the chip bag with one finger out when the system flagged him. It was unclear if Omnilert thought the finger was a gun.
[12]
Teen swarmed by armed cops when school's AI security system mistakes...
A high-school football player was eating a bag of Doritos after practice last week when the school's AI security cameras mistook his snack for a gun -- prompting scores of cops to rush there with guns drawn. Taki Allen, 16, said he was waiting for his ride after Monday's grid practice at Kenwood High School in Baltimore when he finished up his chips, crumpled up the bag and was chatting with his friends -- then heard cops yelling at him while waving their weapons. "Police showed up, like eight cop cars, and then they all came out with guns pointed at me talking about getting on the ground. I was putting my hands up like, 'What's going on?' " Allen told WMAR-2 News. Artificial intelligence that reviews the school's security camera footage had flagged Allen's Doritos as a gun, alerting law enforcement, authorities said. Some officers quickly cuffed Allen, patting him down, while others searched his friends. "Do you have a gun on you?" one of the officers asked in bodycam footage obtained by CBS. Allen, appearing distressed and confused as the officer rifled through his pockets, replied, "What? No." Police officers then reviewed the footage that the AI had flagged, wandered over to a nearby trash can and discovered the bag of chips. "I guess just the way you were eating chips ... Doritos, whatever .... it picked it up as a gun," an officer said of the security system. Another cop said, "AI is not the best." The Oct. 20 incident has left Allen feeling on edge at school, the teen said. He said he no longer feels safe going outside after practice and waits inside the school now to avoid the cameras. "I don't want ... don't think I'm safe enough to go outside, especially eating a bag of chips or drinking something. I just stay inside until my ride comes," Allen said. School administrators have stood by the AI detection system, Omnilert, which is used by schools and law enforcement across the country. 
"In this case, the program did what it was supposed to do," said district Superintendent Myriam Rogers in a press conference Thursday. But Allen said he isn't so sure the system is working. "I don't think no chip bag should be mistaken for a gun at all," he said.
[13]
AI thought a Doritos bag was a gun. What's worse? We believed it!
A 16-year-old boy in Baltimore ended up handcuffed on the ground, surrounded by police with drawn weapons. Because of an empty Doritos bag. Rather, because AI thought that bag of chips was actually a gun! The boy, named Taki Allen, was waiting outside his school for a ride home, minding his own business. That's when a police AI system flagged a "weapon threat." Within minutes, officers arrived. Eight patrol cars, officers with weapons drawn, shouting commands at a terrified teenager who had no clue what was happening. Let that sink in for a second: a literal bag of chips almost got a teenager shot. Welcome to the golden age of artificial "intelligence," where the machines that were supposed to make us smarter are instead giving us some of the dumbest moments in modern tech history. Ironically, this happened in the same week that OpenAI - the torchbearer of the AI revolution - introduced its AI-powered browser ChatGPT Atlas. And it brings me back to the same point: AI is getting out of hand, and somehow no one seems to be paying enough attention to it. AI hallucination has become the industry's worst-kept secret. Ask ChatGPT for sources, and it might invent research papers that don't exist. Use Midjourney, and it could render six-fingered hands like it's some cosmic inside joke. Plug a facial recognition algorithm into a city's CCTV network, and suddenly, an innocent person becomes a "suspect." It's inevitable. AI doesn't "see" or "think" the way we do. It's just a pattern machine, connecting dots in datasets, often without any real understanding of context. So when that surveillance AI in Baltimore mistook a chip bag for a gun, it wasn't being evil. It was being stupid. The problem is, we keep putting that stupidity in charge of things that matter. This wasn't the first time an AI hallucination had real-world fallout.
In 2020, Detroit police wrongfully arrested a man named Robert Williams after an AI facial recognition system said his face matched a shoplifter's. Spoiler: it didn't. He spent 30 hours in custody before cops realised the machine was wrong. A few years back, Tesla's Autopilot failed to distinguish a white truck from the bright sky, leading to a deadly crash. And more recently, several "smart" image models were caught labeling darker-skinned people as "animals." It's not a glitch. It's how these systems work. They're trained on messy, biased data. They "learn" from the internet, which has unfortunately become humanity's least reliable dataset. The real danger isn't AI itself. It's how much we trust it. Humans hesitate, doubt, double-check. Machines? Never. They answer with unshakeable confidence, even when they're dead wrong. And because that confidence sounds like competence, we believe them. Police get an alert from an AI system, and instead of verifying, they mobilise. A lawyer uses ChatGPT to write a case brief, and it cites fake legal precedents. A student runs an AI detector on their essay, and it falsely flags them for cheating. AI doesn't think. It guesses. And sometimes those guesses have real-world consequences. The scary part? These hallucinations are baked into the system. They're not bugs. This means no matter how much these models are fine-tuned, there will always be a certain level of nonsense. When it happens in healthcare, finance or law enforcement, it's dangerous. Take medical AI systems that have misread scans, flagging healthy patients as sick. Or self-driving cars that slam the brakes for shadows. This blind faith in automation is becoming our default setting. We've confused machine-generated with objective. AI's tone is what makes it so dangerous. It doesn't hedge. It doesn't say, "I might be wrong." It speaks with certainty. That certainty makes it useful for small stuff like summaries, scripts, and art ideas, but catastrophic for decisions that impact real lives.
The Baltimore incident wasn't the first such case, and it surely won't be the last. There will be more as organisations keep pushing AI into our lives. A machine mistaking a Doritos bag for a gun isn't just a story about bad image recognition. It's a warning about what happens when we hand over authority to systems that don't know the difference between a weapon and a snack. How do we stop that? I don't know. I don't think anyone has an answer at this point. We will have more nonsense come our way before rules are set, guidelines are written and a framework is put in place. Till then, let's just hope the next AI mistake doesn't end with a kid in cuffs, or worse.
A high school student in Baltimore was handcuffed by police after an AI security system incorrectly identified his bag of Doritos as a potential weapon, highlighting concerns about AI accuracy in critical security applications.
A 16-year-old student at Kenwood High School in Baltimore County, Maryland, experienced a traumatic encounter with law enforcement after an artificial intelligence gun detection system mistakenly identified his bag of Doritos as a potential firearm [1]. Taki Allen was sitting outside with friends after football practice on October 20, waiting to be picked up, when the incident occurred [2].
Approximately 20 minutes after Allen had finished eating his chips, eight police cars arrived at the school in response to the AI alert. Officers emerged from their vehicles with weapons drawn, ordering Allen to put his hands on his head and walk toward them. The teenager was subsequently forced to his knees and handcuffed while police searched both him and his friends [5].

Allen described the confusion to local news outlets, explaining that he was "just holding a Doritos bag -- it was two hands and one finger out, and they said it looked like a gun" [1]. The error became apparent when police reviewed the surveillance footage and found the crumpled chip bag in a nearby trash can.
One officer acknowledged the system's limitations, telling Allen that "AI is not the best" and explaining that the way he was eating chips had triggered the detection system. The Baltimore County Police Department later confirmed that while Allen was handcuffed, he was not formally arrested.
The AI system in question was developed by Omnilert, a company that markets itself as a "pioneer in AI-powered active shooter prevention technology" [5]. According to the company's documentation, the gun detection system operates through a three-step process: AI detection, human verification, and automated emergency response [2].
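That three-step flow can be sketched as a simple gate, where escalation is supposed to happen only if a human reviewer confirms the detection. This is purely illustrative: the class and function names below are invented for the sketch and are not Omnilert's actual API.

```python
from dataclasses import dataclass


@dataclass
class Alert:
    camera_id: str
    label: str         # what the model thinks it saw, e.g. "firearm"
    confidence: float  # model score in [0, 1]
    canceled: bool = False  # set during human review


def human_verify(alert: Alert, reviewer_confirms: bool) -> Alert:
    # Step 2: a person reviews the frame; canceling should stop escalation.
    alert.canceled = not reviewer_confirms
    return alert


def escalate(alert: Alert) -> str:
    # Step 3: emergency response fires only if the alert survived review.
    if alert.canceled:
        return "no dispatch"
    return f"dispatch police to camera {alert.camera_id}"


# Step 1: the detector raises an alert (here, a false positive on a chip bag).
alert = Alert(camera_id="kenwood-lot-3", label="firearm", confidence=0.71)
alert = human_verify(alert, reviewer_confirms=False)  # reviewer cancels
print(escalate(alert))  # prints "no dispatch"
```

Note that in the reported incident the cancellation did happen, but word of it never reached everyone downstream. A gate like this only protects people if the cancellation actually propagates to whoever can still call the police.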
Principal Katie Smith explained in a statement to parents that the school's security department had actually reviewed and canceled the gun detection alert. However, Smith, who was unaware that the alert had been canceled, reported the situation to the school resource officer, who then contacted local police [1]. School Superintendent Dr. Myriam Rogers defended the system, stating that "the program did what it was supposed to do which was to signal an alert and for humans to take a look".
Omnilert issued a statement expressing regret over the incident while maintaining that "the process functioned as intended" [1]. The company told the BBC that while the object was later determined not to be a firearm, the system prioritized "safety and awareness through rapid human verification" [4].
The company's website promotes its technology as using "real, diverse data" leading to "more reliable detection, fewer false positives," claiming that its "data-centric methodology trains AI to succeed" in real-world scenarios [4]. However, this incident raises questions about the accuracy of such systems, particularly given that last year another weapons-scanning company, Evolv Technology, was banned from making unsupported claims about its AI scanner's capabilities [4].

The incident has had a lasting psychological impact on Allen, who described feeling scared during the encounter and wondering "am I gonna die? Are they going to kill me?" [5]. The teenager now waits inside after football practice, stating he doesn't think it's "safe enough to go outside, especially eating a bag of chips or drinking something" [4].

Local politicians have responded to the incident by calling for a review of the AI-powered surveillance technology. Baltimore County Councilman Izzy Patoka wrote on Facebook that he was "calling on Baltimore County Public Schools to review procedures around its AI-powered weapon detection system" [4]. Both Omnilert and the school have offered counseling services to students affected by the incident [2].