5 Sources
[1]
Cops pause use of flawed AI cameras secretly monitoring streets for suspects
New Orleans police have reportedly spent years scanning live feeds of city streets and secretly using facial recognition to identify suspects in real time -- in seeming defiance of a city ordinance designed to prevent false arrests and protect citizens' civil rights. A Washington Post investigation uncovered the dodgy practice, which relied on a private network of more than 200 cameras to automatically ping cops' phones when a possible match for a suspect was detected. Court records and public data suggest that these cameras "played a role in dozens of arrests," the Post found, but most uses were never disclosed in police reports. That seems like a problem, the Post reported, since a 2022 city council ordinance required much more oversight for the tech. Rather than instantly detaining supposed suspects the second they pop up on live feeds, cops were only supposed to use the tech to find "specific suspects in their investigations of violent crimes," the Post reported. And in those limited cases, the cops were supposed to send images to a "fusion center," where at least two examiners "trained in identifying faces" using AI software had to agree on alleged matches before cops approached suspects. Instead, the Post found that "none" of the arrests "were included in the department's mandatory reports to the city council." And at least four people arrested were charged with nonviolent crimes. Some cops apparently found the city council process too sluggish and chose to ignore it to get the most out of their access to the tech, the Post found. Now, New Orleans police have paused the program amid backlash over what Nathan Freed Wessler, the deputy director of the American Civil Liberties Union (ACLU) Speech, Privacy, and Technology Project, suggested might be the sketchiest use of facial recognition yet in the US. He told the Post this is "the first known widespread effort by police in a major US city to use AI to identify people in live camera feeds for the purpose of making immediate arrests." New Orleans Police Department superintendent Anne Kirkpatrick told the Post that she would be conducting a review of the program and turning off all automated alerts until she is "sure that the use of the app meets all the requirements of the law and policies." The ACLU is demanding a stronger response, asking for a full investigation into how many arrests were made and urging NOPD to permanently stop using the AI-enhanced feeds. In a statement sent to Ars, Alanah Odoms, the executive director of the ACLU of Louisiana, said that without a full investigation, there would be no way to know the extent of potential harms of the secret AI surveillance to the community. "We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies," Odoms said. "These individuals could be added to Project Nola's watchlist without the public's knowledge and with no accountability or transparency on the part of the police departments."
Some states ban using facial recognition for immediate arrests
The cameras in New Orleans are operated by Project Nola, a nonprofit founded by a former cop, Bryan Lagarde, who wanted to help police more closely monitor the city's "crime-heavy areas," the Post reported.
Configured to scan live footage for people "on a list of wanted suspects," the camera network supposedly assisted in at least 34 arrests since 2023, Project Nola has claimed in social media posts. But the Post struggled to verify that claim, since "the city does not track such data and the nonprofit does not publish a full accounting of its cases." According to police records submitted to the city council, the network "only proved useful in a single case." Investigating the tension between these claims, the Post suggested we may never know how many suspects were misidentified or what steps police took to ensure responsible use of the controversial live feeds. In the US, New Orleans stands out for taking a step further than law enforcement in other regions by using live feeds from facial recognition cameras to make immediate arrests, the Post noted. The Security Industry Association told the Post that four states -- Maryland, Montana, Vermont, and Virginia -- and 19 cities nationwide "explicitly bar" the practice. Lagarde told the Post that police cannot "directly" search for suspects on the camera network or add suspects to the watchlist in real time. Reese Harper, an NOPD spokesperson, told the Post that his department "does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project Nola crime cameras." In a federally mandated 2023 audit, New Orleans police complained that complying with the ordinance took too long and "often" resulted in no matches. That could mean the tech is flawed, or it could be a sign that the process was working as intended to prevent wrongful arrests. The Post noted that in total, "at least eight Americans have been wrongfully arrested due to facial recognition," as both police and AI software rushing arrests are prone to making mistakes. "By adopting this system -- in secret, without safeguards, and at tremendous threat to our privacy and security -- the City of New Orleans has crossed a thick red line," Wessler said. "This is the stuff of authoritarian surveillance states and has no place in American policing." Project Nola did not immediately respond to Ars' request for comment.
[2]
NOLA PD halts facial recognition alerts from private cams
Since early 2023, facial recognition cameras run by a private nonprofit have scanned New Orleans visitors and residents and quietly alerted police, sidestepping oversight and potentially violating city law, according to a new report. In 2022, the Big Easy's city government relaxed its ban on the use of facial recognition technology. It could be used to investigate violent crimes, but had to be checked by a human operator before action was taken. But an investigation published Monday by the Washington Post found that within a year, police were quietly receiving continuous real-time facial recognition alerts from a privately operated camera network. These alerts came from cameras managed by nonprofit Project NOLA, which runs a sprawling, privately funded surveillance network across the city, the report says. Project NOLA claims access to more than 5,000 camera feeds in the New Orleans area, with over 200 equipped for facial recognition. The system compares faces against a privately compiled database of more than 30,000 individuals, assembled partly from police mugshots. When a match is detected, officers receive a mobile phone alert with the person's identity and location, according to the report. The police were required to notify the city council each time they used facial recognition technology in an investigation or arrest, but reportedly failed to do so. In multiple cases, police reports omitted any mention of the technology, raising concerns that defendants were denied the opportunity to challenge the role facial recognition played in their arrest. As scrutiny mounted, the police department distanced itself from the operation, saying in a carefully worded statement that it "does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project NOLA crime cameras." "Until now, no American police department has been willing to risk the massive public blowback from using such a brazen face recognition surveillance system," said Nathan Freed Wessler, deputy director of ACLU's Speech, Privacy, and Technology Project, in a press release. "By adopting this system - in secret, without safeguards, and at tremendous threat to our privacy and security - the City of New Orleans has crossed a thick red line. This is the stuff of authoritarian surveillance states, and has no place in American policing." Safeguards are there for a reason, as past cases have already shown. In 2022, Randall Reid was arrested in Georgia after Louisiana deputies used Clearview AI to match his driver's license photo to surveillance footage from a purse theft, despite his claim that he had never been to the state. He spent six days in jail, incurred thousands in legal fees, and in 2023 filed a federal lawsuit alleging wrongful arrest based solely on a facial recognition match. In 2020, Detroit police made headlines when they falsely identified and arrested Robert Williams on suspicion of being a shoplifter - Williams later testified to Congress about the experience. A year later, it was the turn of Lamya Robinson, then 14, who was ejected from a roller rink after being falsely pegged as a "97 percent match" to a known troublemaker. Cases like these helped fuel public backlash and legislative efforts to rein in facial recognition technology.
New Orleans was no exception, banning the tech in 2020. But the 2022 ordinance relaxed the rules slightly to allow its use via the Louisiana Fusion Center, which aggregates data from police across the state. At the time, police assured city officials the technology would only be used as a last resort after other identification methods failed. Sergeant David Barnes testified that any request required supervisory approval and that matches had to be reviewed by multiple staff members before being acted upon. Project NOLA wasn't mentioned, and it's possible police believed that receiving alerts from a private system exempted them from the rules. The nonprofit certainly has the hardware to support real-time surveillance - its website promotes AI-enabled cameras, offered free with installation fees, and cloud storage plans. An outlay of $300 a year gets you a basic camera system, while $2,200 covers a high-end 4K model with 25x zoom, STARVIS night vision, and AI that automatically tracks people and vehicles, flashing red and blue lights and a spotlight when it detects intruders or suspicious activity. Footage is typically stored for 30 days, though that window has been extended to 90 days in some districts following recent policy changes. The Post investigators started firing off questions to the police and the city in February. On April 8, NOPD boss Anne Kirkpatrick reportedly sent out an all-hands memo to staff, saying that an officer had raised concerns about the system and suspended its use. She wrote that Project NOLA had been asked to suspend alerts to officers until she was "sure that the use of the app meets all the requirements of the law and policies." Neither NOPD nor Project NOLA had any comment at the time of going to press. ®
[3]
Police secretly monitored New Orleans with facial recognition cameras
Following records requests from The Post, officials paused the first known, widespread live facial recognition program used by police in the United States. NEW ORLEANS -- For two years, New Orleans police secretly relied on facial recognition technology to scan city streets in search of suspects, a surveillance method without a known precedent in any major American city that may violate municipal guardrails around use of the technology, an investigation by The Washington Post has found. Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime. New Orleans police took this technology a step further, utilizing a private network of more than 200 facial recognition cameras to watch over the streets, constantly monitoring for wanted suspects and automatically pinging officers' mobile phones through an app to convey the names and current locations of possible matches. This appears out of step with a 2022 city council ordinance, which limited police to using facial recognition only for searches of specific suspects in their investigations of violent crimes and never as a more generalized "surveillance tool" for tracking people in public places. Each time police want to scan a face, the ordinance requires them to send a still image to trained examiners at a state facility and later provide details about these scans in reports to the city council -- guardrails meant to protect the public's privacy and prevent software errors from leading to wrongful arrests. Since early 2023, the network of facial recognition cameras has played a role in dozens of arrests, including at least four people who were only charged with nonviolent crimes, according to police reports, court records and social media posts by Project NOLA, a crime prevention nonprofit company that buys and manages many of the cameras. Officers did not disclose their reliance on facial recognition matches in police reports for most of the arrests for which the police provided detailed records, and none of the cases were included in the department's mandatory reports to the city council on its use of the technology. Project NOLA has no formal contract with the city, but has been working directly with police officers. "This is the facial recognition technology nightmare scenario that we have been worried about," said Nathan Freed Wessler, a deputy director with the ACLU's Speech, Privacy, and Technology Project, who has closely tracked the use of AI technologies by police. "This is the government giving itself the power to track anyone -- for that matter, everyone -- as we go about our lives walking around in public." Anne Kirkpatrick, who heads the New Orleans Police Department, paused the program in early April, she said in an interview, after a captain identified the alerts as a potential problem during a review. In an April 8 email reviewed by The Post, Kirkpatrick told Project NOLA that the automated alerts must be turned off until she is "sure that the use of the app meets all the requirements of the law and policies." The Post began requesting public records about the alerts in February. The police department "does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project Nola crime cameras," Reese Harper, a spokesman for the agency, said in an emailed statement. 
Police across the country rely on facial recognition software, which uses artificial intelligence to quickly map the physical features of a face in one image and compare it to the faces in huge databases of images -- usually drawn from mug shots, driver's licenses or photos on social media -- looking for possible matches. New Orleans's use of automated facial recognition has not been previously reported and is the first known widespread effort by police in a major U.S. city to use AI to identify people in live camera feeds for the purpose of making immediate arrests, Wessler said. The Post has reported that some police agencies use AI-powered facial recognition software in violation of local laws, discarding traditional investigative standards and putting innocent people at risk. Police at times arrested suspects based on AI matches without independent evidence connecting them to the crime, raising the chances of a false arrest. Often, police failed to inform defendants about their use of facial recognition software, denying them the opportunity to contest the results of a technology that has been shown to be less reliable for people of color, women and older people. One of the few places where live facial recognition is known to be in wide use is London, where police park vans outside of high-traffic areas and use facial recognition-equipped cameras to scan the faces of passersby, and confront people deemed a match to those on a watch list. While the city says the program has never led to a false arrest since launching in 2016, Big Brother Watch, a London-based civil liberties group, argues that the practice treats everyone as a potential suspect, putting the onus on the people who were falsely matched to prove their innocence.
Real-time alerts
The surveillance program in New Orleans relied on Project NOLA, a private group run by a former police officer who assembled a network of cameras outside of businesses in crime-heavy areas including the city's French Quarter district. Project NOLA configured the cameras to search for people on a list of wanted suspects. When the software determined it had found a match, it sent real-time alerts via an app some officers installed on their mobile phones. The officers would then quickly research the subject, go to the location and attempt to make arrests. Police did not set up the program nor can they directly search for specific people, or add or remove people from the camera system's watch list, according to Bryan Lagarde, Project NOLA's founder. Little about this arrangement resembles the process described in the city council ordinance from three years ago, which imagined detectives using facial recognition software only as part of methodical investigations with careful oversight. Each time police want to scan a face, the ordinance requires them to send a still image to a state-run "fusion center" in Baton Rouge, where various law enforcement agencies collaborate on investigations. There, examiners trained in identifying faces use AI software to compare the image with a database of photos and only return a "match" if at least two examiners agree. Investigators have complained that process takes too long and often doesn't result in any matches, according to a federally mandated audit of the department in 2023. It has only proved useful in a single case that led to an arrest since October 2022, according to records police provided to the city council.
By contrast, Project NOLA claims its facial recognition cameras played a role in at least 34 arrests since they were activated in early 2023, according to the group's Facebook posts -- a number that cannot be verified because the city does not track such data and the nonprofit does not publish a full accounting of its cases. Without a list of the cases, it's impossible to know whether any of the people were misidentified or what additional steps the officers took to confirm their involvement in the crimes. Kirkpatrick said her agency has launched a formal review into how many officers used the real-time alerts, how many people were arrested as a result, how often the matches appear to have been wrong and whether these uses violated the city ordinance. "We're going to do what the ordinance says and the policies say, and if we find that we're outside of those things, we're going to stop it, correct it and get within the boundaries of the ordinance," she said. There are no federal regulations around the use of AI by local law enforcement. Four states -- Maryland, Montana, Vermont and Virginia -- as well as at least 19 cities in nine other states explicitly bar their own police from using facial recognition for live, automated or real-time identification or tracking, according to the Security Industry Association, a trade group. Lawmakers in these places cited concerns in public meetings that the technology could infringe on people's constitutional rights or lead police to make mistakes when they rush to arrest a potential suspect before taking steps to confirm their connection to the crime, as many people look alike. At least eight Americans have been wrongfully arrested due to facial recognition, The Post and others have reported. The unsanctioned surveillance program in New Orleans highlights the challenge of regulating a technology that is widely available, at a time when some police see AI as an invaluable crime fighting tool. Even in some places where officials have banned facial recognition, including Austin and San Francisco, officers skirted the bans by covertly asking officers from neighboring towns to run AI searches on their behalf, The Post reported last year. Violent crime rates in New Orleans, like much of the country, are at historic lows, according to Jeff Asher, a consultant who tracks crime statistics in the region. But city officials have seized on recent instances of violent crime to argue that police need the most powerful tools at their disposal. Last month, an independent report commissioned after the New Year's Day attack that left 14 people dead on Bourbon Street found the New Orleans police to be understaffed and underprepared. The report, overseen by former New York City police commissioner William Bratton, advised New Orleans to explore adopting several new tools, including drones, threat prediction systems and upgrades to the city's real-time crime center -- but did not recommend adding any form of facial recognition. Kirkpatrick, the city's top police official, and Jason Williams, its top prosecutor, both said they are in discussions with the city council to revise the facial recognition ordinance.
Kirkpatrick says she supports the idea of the city legally operating its own live facial recognition program, without the involvement of Project NOLA and with certain boundaries, such as prohibiting use of the technology to identify people at a protest. "Can you have the technology without violating and surveilling?" she asked. "Yes, you can. And that's what we're advocating for."
5,000 cameras
Few people have as much visibility into the everyday lives of New Orleans residents as Lagarde, a former patrol officer and investigator who started his own video surveillance business in the late 1990s before launching Project NOLA in 2009. Funded by donations and reliant on businesses that agree to host the cameras on their buildings or connect existing surveillance cameras to its centralized network, Lagarde said Project NOLA has access to 5,000 crime cameras across New Orleans, most of which are not equipped with facial recognition. The cameras all feed into a single control room in a leased office space on the University of New Orleans campus, Lagarde said in an interview at the facility. Some camera feeds are also monitored by federal, state and local law enforcement agencies, he said. Project NOLA made $806,724 in revenue in 2023, tax filings show. Much of it came from "cloud fees" the group charges local governments outside of New Orleans -- from Monticello, Florida, to Frederick, Colorado -- which install Project NOLA cameras across their own towns and rely on Lagarde's assistance monitoring crime. He's experimented with facial recognition in Mississippi, he said, but his "first instance of doing citywide facial recognition is New Orleans." New Orleans does not pay Project NOLA. For more than a decade, Lagarde used standard cameras outside businesses to monitor crime and offer surveillance clips for officers to use in their investigations. Lagarde's cameras became so widespread that police began calling him when they spotted a Project NOLA camera hovering near a crime scene they were investigating, according to police incident reports, interviews with police and emails obtained through a public records request. Lagarde began adding facial recognition cameras to his network in early 2023, after an $87,000 bequest from a local woman. Lagarde used the money to buy a batch of cameras capable of detecting people from about 700 feet away and automatically matching them to the facial features, physical characteristics and even the clothing of people in a database of names and faces he has compiled. Lagarde says he built his database partly from mug shots from local law enforcement agencies. It includes more than 30,000 "local suspected and known criminals," Project NOLA wrote on Facebook in 2023. Lagarde can quickly identify anyone in the database the moment they step in front of a Project NOLA camera, he said. He can also enter a name or image to pull up all the video clips of that person Project NOLA captured within the last 30 days, after which Lagarde says videos get automatically deleted "for privacy reasons." Project NOLA found enthusiastic partners in local business owners, some of whom were fed up with what they saw as the city's inability to curb crime in the French Quarter -- the engine of its tourism economy that's also a hub for drug dealers and thieves who prey on tourists, said Tim Blake, the owner of Three Legged Dog, a bar that was one of the first places to host one of Project NOLA's facial recognition cameras.
"Project NOLA would not exist if the government had done its job," Blake said. While Lagarde sometimes appears alongside city officials at news conferences announcing prominent arrests, he is not a New Orleans government employee or contractor. Therefore, Lagarde and the organization are not required to share information about facial recognition matches that could be critical evidence in the courtroom, said Danny Engelberg, the chief public defender for New Orleans. "When you make this a private entity, all those guardrails that are supposed to be in place for law enforcement and prosecution are no longer there, and we don't have the tools to do what we do, which is hold people accountable," he said. Lagarde says he tries to be transparent by posting about some of his successful matches on Facebook, though he acknowledges that he only posts a small fraction of them and says it would be "irresponsible" to post information about open investigations. Project NOLA, he added, is accountable to the businesses and private individuals who host the cameras and voluntarily opt to share their feeds with the network. "It's a system that can be turned off as easily as it's been turned on," he said. "Were we to ever violate public trust, people can individually turn these cameras off." Banned devices Lagarde declined to say who makes the equipment he uses, saying he doesn't want to endorse any company. Several Project NOLA cameras in the French Quarter look nearly identical to ones on the website of Dahua, a Chinese camera maker, and product codes stamped on the backs of these devices correspond to an identical camera sold by Plainview, New York-based equipment retailer ENS Security, which has acknowledged reselling Dahua cameras in the past. Project NOLA's website also contains a link to download an app where police officers can view and manage footage. The app, called DSS, is made by Dahua. Congress banned federal agencies from using products or services made by Dahua and a list of other Chinese companies in 2018, citing concerns that the equipment could be used by President Xi Jinping's government to spy on Americans. Since 2020, the law has barred any agency or contractor that receives federal funds from using those funds on the banned products. A Dahua spokesperson declined to comment on the New Orleans cameras and said the company stopped selling equipment in the U.S. last year. The New Orleans Police Department has received tens of millions of dollars from the federal government in recent years and confirmed that some officers have installed this DSS app on mobile phones and police workstations. Kirkpatrick said she was not aware of who made the app or cameras but would look into it. Lagarde said Project NOLA uses "American-made, brand-name servers to operate our camera program." Some city officials argue that police are not violating the city's facial recognition ordinance because they do not own the cameras or contract with Lagarde; they are merely receiving tips from an outside group that is performing facial recognition scans on its own. "If Bryan Lagarde calls an officer and says 'I think a crime is occurring on the 1800 Block of Bienville,' that's no different than Miss Johnson looking out of her window and saying 'I think a crime is occurring on 1850 Bienville,'" Williams, the Orleans Parish district attorney, said in an interview. 
But in many cases, police have gone to Lagarde to request footage or help identifying and locating suspects, according to police reports, Project NOLA social media posts and internal police emails.
Tracking a suspect
In one case last year, a police detective investigating a snatched cellphone relied on Project NOLA to identify the perpetrator and track him down using facial recognition alerts, according to accounts of the investigation drawn partly from the police incident report and partly from Project NOLA's Facebook post. The detective contacted Lagarde "to assist locating the perpetrator on Project NOLA cameras," according to the police report, providing still shots taken from the city's surveillance camera footage. Lagarde used Project NOLA's clothing recognition tool to find previous video footage of a suspect. With the new, better images of his face, Project NOLA used facial recognition to learn his possible identity and share that with the detective. The detective took that name and found photos of a man on social media whose appearance and tattoos matched the phone-snatcher. Police got a warrant for his arrest. Lagarde added that name and face to Project NOLA's watch list, and a few days later, cameras automatically identified him in the French Quarter and alerted police, who found and arrested him. The man was charged with robbery but pleaded guilty to the lesser offense of theft, court records show. The police report mentioned that Lagarde helped identify the suspect, but did not mention that he used facial recognition to do so or used live facial recognition and automated alerts to monitor for and locate him. David Barnes, a New Orleans police sergeant overseeing legal research and planning, said officers are trained to always find probable cause before making an arrest. He said Lagarde sometimes overstates in Facebook posts the role his technology played in some of the cases. He said the detective investigating the phone-snatching case was only asking Lagarde to find videos of the suspect, not the location of the suspect. On a rainy May morning outside the Three Legged Dog, a Project NOLA camera swiveled about, blinking red and blue lights, and twitching side to side as it followed cars and people based on an automated program. The camera is no longer pinging the police on an app -- at Kirkpatrick's request. "Like you and everybody else, I do not want to lose any cases of violent criminals based on policy violations or violations of our ordinances," Kirkpatrick said in her email last month to Lagarde. But the alerts still go to Project NOLA staff, who Lagarde said convey the location of wanted suspects to the police via phone calls, texts and emails. Schaffer reported from Washington. Nate Jones and Jeremy Merrill contributed to this report.
[4]
New Orleans may be 1st U.S. city to use live AI facial recognition cameras
Why it matters: It's likely the first AI-enhanced live surveillance system to be used in a major American city, The Washington Post says of its investigation.
The big picture: New Orleans police have been using information from Project NOLA's 200-plus camera network to find wanted individuals for at least the past two years, Douglas MacMillan and Aaron Schaffer wrote in the story.
Case in point: NOPD Superintendent Anne Kirkpatrick said the department used the facial recognition technology Friday to identify one of the 10 inmates who escaped from Orleans Justice Center earlier that day.
Between the lines: Project NOLA doesn't have a formal contract with the city, Lagarde tells Axios. Individual officers download the app and sign up for alerts, he says.
Flashback: A 2022 City Council ordinance regulates how the city uses facial recognition software, according to reporting by The Lens.
What's next: Kirkpatrick told the Washington Post the department is doing a formal review of how many officers used the alerts, what arrests were made and whether this violated the ordinance.
[5]
Facial recognition technology use in search for New Orleans jail escapees under scrutiny
During the ongoing massive manhunt for 10 inmates who escaped from a New Orleans jail last week, authorities say the use of facial recognition cameras run by a private organization helped lead to the recapture of one of the fugitives -- even as the police department has come under scrutiny by critics from civil rights organizations to conservative politicians over its use of the technology. Earlier this week, New Orleans Police Department Superintendent Anne Kirkpatrick told ABC News that facial recognition cameras maintained by Project N.O.L.A. had been used in the New Orleans manhunt despite the fact that she recently ordered a pause in the automated alerts her officers had been receiving from the group, which operates independently of the police department. Kirkpatrick recently told The Washington Post she ordered the alerts to officers turned off until she is "sure that the use of the app meets all the requirements of the law and policies." Citing the New Orleans Police Department's partnership with Project N.O.L.A., the American Civil Liberties Union said in a statement it is believed to be the first known widespread effort by a major American law enforcement agency to use artificial intelligence technology to identify suspects in an assortment of crimes across the city. In a statement, the ACLU said the use of live facial recognition raises constitutional and privacy issues and "is a radical and dangerous escalation of the power to surveil people as we go about our daily lives." Critics of the New Orleans Police Department's use of facial recognition cameras said that average citizens are not opting in to being scanned by the cameras, nor are they being made aware that the scanning is happening. "Facial recognition technology poses a direct threat to the fundamental rights of every individual and has no place in our cities," Alanah Odoms, executive director of the ACLU of Louisiana, said in a statement about the city's partnership with Project N.O.L.A. "We call on the New Orleans Police Department and the City of New Orleans to halt this program indefinitely and terminate all use of live-feed facial recognition technology." Some Republicans in Congress have also opposed the unchecked use of the technology, most notably Reps. Jim Jordan of Ohio, Andy Biggs of Arizona, Warren Davidson of Ohio, Thomas Massie of Kentucky and Sens. Mike Lee of Utah and Steve Daines of Montana. In a March 27, 2025 letter to Kash Patel, who was then acting director of the federal Bureau of Alcohol, Tobacco, Firearms and Explosives, Biggs, the chairman of the House Subcommittee on Crime and Federal Government Surveillance, and Davidson raised concerns over news reports indicating the ATF utilized facial recognition technology to identify gun owners. "The Subcommittee has concerns about ATF's use of facial recognition and AI programs and the effects that its use has upon American citizens' Second Amendment rights and rights to privacy," the lawmakers wrote in their letter, requesting documents on policies and training in the use of facial recognition technology. Democrats, including Rep. Zoe Lofgren of California and Sen. Ron Wyden of Oregon, have also joined bipartisan efforts to curtail the use of such surveillance. The 10 inmates escaped from the Orleans Justice Center in New Orleans on May 16, officials said. Five of the fugitives have since been recaptured, leaving five others, including three charged with murder, still on the run as of Thursday afternoon.
Kirkpatrick told ABC News this week that one of the fugitives was caught and another narrowly got away after live facial recognition cameras operated by Project N.O.L.A. located them while scanning crowds in the French Quarter. Bryan Lagarde, executive director of Project N.O.L.A., told ABC News that after being notified of the jailbreak on Friday, state police gave his group a list of the escapees. "We put that into our facial recognition. It took approximately four minutes to do that and within, literally, less than a minute later we started tracking two of the escapees," Lagarde said. He said the information about fugitive Kendall Myles and another escaped inmate, who he said is facing attempted second-degree murder charges, was sent to state police investigators who confirmed the two men were part of the jailbreak. "Then they immediately went out to the French Quarter, which is where we were tracking them walking down Bourbon Street," Lagarde said. Myles was arrested after police found him hiding under a car. The second escapee, however, managed to get away. "I'm sure they knew there were cameras because they were walking around with their faces held down and things like that. All it takes is just a second for them to look up and then there's facial recognition," Lagarde said. Citing the ongoing investigation, Lagarde declined to say if his cameras have located any of the other escapees. Lagarde said that his organization has been using live facial recognition cameras in New Orleans for the past two years. In response to potential privacy concerns, Lagarde said, "As far as the facial recognition is concerned, it's scanning your face, my face, everyone's faces. If you're wanted and we know that you're wanted, you're going to be in trouble. If you are not wanted, it's going to instantly disregard your face and just move on to the next person." He said his group maintains about 5,000 cameras in New Orleans, including 200 that have facial recognition capabilities. He said the facial recognition cameras not only scan faces, but also clothing, vehicles and license plates. "We work a very large number of the major crimes here in New Orleans: Homicides, shootings, stabbings, home invasions, rapes, robberies all the way down to the thefts and the burglaries," Lagarde said. Project N.O.L.A. works with the New Orleans Police Department and the Louisiana State Police but does not have an official contract with either agency, officials said. Before the manhunt, the New Orleans police had appeared to distance themselves from Project N.O.L.A. The police department "does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network" of Project N.O.L.A.'s cameras, a spokesman for the police department said in a statement to The Washington Post. Kirkpatrick, the New Orleans police superintendent, said her agency has operated surveillance cameras across the city, many in the entertainment districts, but none of them have facial recognition capabilities. According to the New Orleans Police & Justice Foundation, the city has about 3,600 police-operated cameras across the city. While the city has an ordinance on the use of facial recognition technology, Kirkpatrick said there are exceptions to the rules. "Sometimes, people think that we have a total ban on the use of facial recognition and that is not quite accurate," Kirkpatrick said.
"There are exceptions, and I think that this one would meet the exception of those ordinances." According to the city ordinance, "Evidence obtained from facial recognition alone shall not be sufficient to establish probable cause for the purpose of effectuating an arrest by the NOPD or another law enforcement agency. The source of the image and the underlying reasons for the requested use of facial recognition systems as an investigative lead shall be documented in a police report." The ordinance says "facial-recognition technology, shall not be used as a surveillance tool." But the ordinance also states that "nothing in this section shall prohibit NOPD from requesting the use of facial recognition technology in the investigation of the prior occurrence of the following significant crimes as defined in Louisiana Revised Statute," including murder, manslaughter, solicitation of murder, first-degree robbery, drive-by shootings and carjackings. "They had my permission, that's for sure," Kirkpatrick said of the use of facial recognition technology in the manhunt. Three of the five escaped inmates still being sought on Friday have been have been charged with murder or attempted murder, including one who was convicted in a double homicide, authorities. The Washington Post investigation published this week reported that New Orleans police were using Project N.O.L.A.'s network of facial recognition cameras to monitor the streets for wanted suspects over the past two years in ways that appeared "out of step" with the local ordinance. In the interview with ABC about the manhunt, Kirkpatrick said that Project N.O.L.A. is a "useful partner" but stressed that it is not law enforcement and is not bound by the local ordinance, raising issues of accountability about Project N.O.L.A. and the data it collects on ordinary citizens who are being surveilled in this untargeted manner. "I'm very supportive of any technology that we can use to bring violent people back in, and then we can deal with the issues later, but we actually operate within the boundaries of the law," she said. "As long as it's constitutional, ethical, we're going to stay within the boundaries. But this is a bigger topic and discussion, mainly for our politicians to decide what kind of laws they want." Other police departments across the country have faced questions over their use of the technology. The use of facial recognition software by U.S. businesses has also grown sharply in recent years, analysts and privacy advocates told ABC News. The uses range from tech companies securing personal devices and retailers scanning for potential shoplifters to e-commerce giants tracking delivery drivers. Retailers are also using facial recognition scanning on shoppers to adjust pricing in stores. Companies contend that the technology helps them achieve a safe and efficient operation, benefiting consumers and employees alike. Critics say the powerful tool encroaches on the privacy of everyday people, risking undue punishment or discrimination, the experts said. Jake Laperruque, deputy director of the Center for Democracy & Technology's Security and Surveillance Project, said facial recognition cameras are an "unproven, error-prone tool." "This is the first documented case in the U.S. of police using untargeted facial recognition, which countries like China employ to track people across cities and surveil their Uyghur citizens," Laperruque said in a statement to ABC News regarding New Orleans' police use of the technology. 
"This kind of dragnet system belongs in a dystopian sci-fi movie, not in American cities. Average pedestrians shouldn't have to worry that untested AI technology will set off alarm bells and send police after them." One of the key issues of facial recognition and AI is that studies have shown that it can be racially biased and is particularly error prone with people of color, older people and women. "There's been error rates between 80 and 90%. That means nine out of every ten times that the system says, 'Hey, here's someone from our watch list,' it's actually a false alarm," Laperruque said of the use of these cameras as untargeted or real-time surveillance tools based on pilot programs run in the United Kingdom. "Facial recognition could be used to catalog attendees at a protest or political rallies of any affiliation, individuals going to a church, people visiting a medical clinic, or an array of other sensitive activities," Laperruque told ABCNews. He added, "Given these risks it's no surprise that surveillance reform in general -- and placing guardrails on facial recognition in particular -- has support from across the political spectrum, including some of the most progressive and conservative members of Congress -- just last month at a Congressional hearing conservative members of Congress highlighted the dangers of facial recognition and other unchecked forms of surveillance."
New Orleans police have suspended the use of a private AI-powered facial recognition camera network that secretly monitored city streets for suspects, raising concerns about privacy violations and potential misuse of the technology.
In a groundbreaking development, the New Orleans Police Department (NOPD) has suspended its use of a controversial AI-powered facial recognition system that had been secretly monitoring city streets for suspects. This program, believed to be the first of its kind in a major U.S. city, has raised significant concerns about privacy violations and potential misuse of technology [1][2].
The surveillance network, operated by Project NOLA, a nonprofit organization founded by former police officer Bryan Lagarde, consists of over 200 cameras equipped with facial recognition capabilities [1]. These cameras were configured to scan live footage for individuals on a list of wanted suspects, automatically alerting officers' mobile phones when a possible match was detected [3]. Project NOLA claims to have access to more than 5,000 camera feeds in the New Orleans area, with a database of over 30,000 individuals compiled partly from police mugshots [2]. The system reportedly played a role in dozens of arrests since 2023, including at least four people charged with nonviolent crimes [1].
The use of this technology appears to conflict with a 2022 city council ordinance that limited police use of facial recognition to specific investigations of violent crimes [1]. The ordinance required police to send images to a "fusion center" for verification by trained examiners before approaching suspects, a process that some officers reportedly found too slow [1][3].
Nathan Freed Wessler, deputy director of the ACLU's Speech, Privacy, and Technology Project, described the program as "the facial recognition technology nightmare scenario" and "the stuff of authoritarian surveillance states" [3][4]. The ACLU is calling for a full investigation into the extent of the program's use and its potential harms to the community [1].
NOPD Superintendent Anne Kirkpatrick has paused the program and ordered a review to ensure compliance with laws and policies [1][2]. The police department has distanced itself from the operation, stating that it "does not own, rely on, manage, or condone the use by members of the department of any artificial intelligence systems associated with the vast network of Project NOLA crime cameras" [2].
This case highlights broader concerns about the use of facial recognition technology in law enforcement. Past incidents of wrongful arrests based on facial recognition matches, such as those of Randall Reid in Georgia and Robert Williams in Detroit, underscore the potential for errors and civil rights violations [2][5]. The use of live facial recognition raises constitutional and privacy issues, with critics arguing that it treats everyone as a potential suspect without their knowledge or consent [3][5]. The ACLU of Louisiana has called for an indefinite halt to the program and the termination of all use of live-feed facial recognition technology [5].
As the debate over the balance between public safety and privacy rights continues, the New Orleans case serves as a critical example of the challenges and controversies surrounding the implementation of AI-powered surveillance technologies in law enforcement.
Summarized by Navi