Curated by THEOUTPOST
On Mon, 26 Aug, 4:03 PM UTC
11 Sources
[1]
US police officers experiment using AI chatbots to write crime reports
US police are using technology that draws on the sound recorded by their body cameras to write up incident reports in as little as eight seconds. Some police departments in the US are experimenting with artificial intelligence (AI) chatbots to produce the first drafts of their incident reports. One such tool, built on the same generative AI model as ChatGPT, pulls the sound and radio chatter from a microphone on an officer's body camera and can churn out a report in eight seconds. "It was a better report than I could have ever written and it was 100 per cent accurate. It flowed better," said Matt Gilmore, a police sergeant with the Oklahoma City Police. The new tool could join an expanding AI toolkit that US police already use, including algorithms that read license plates, recognise suspects' faces, or detect gunshots. Rick Smith, CEO and founder of Axon, the company behind the AI product, called Draft One, said the technology could eliminate much of the paperwork police officers have to do, leaving them more time for the work they joined the force to do. But, like other AI technologies used by police, Smith acknowledged there are concerns. They come mainly from district attorneys, who want to make sure officers know what is in their reports in case they have to testify in a criminal proceeding about what they saw at the crime scene. "They never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't,'" Smith said. The introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use. In Oklahoma City, officials showed the tool to local prosecutors, who advised caution before using it in high-stakes criminal cases. But there are cities elsewhere in the US where officers can use the technology on any case, or as they see fit. Legal scholar Andrew Ferguson would like to see more public discussion of the benefits and potential harms of this technology before it becomes widely adopted.
For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods to a police report. "I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing," said Ferguson, a law professor at American University working on what's expected to be the first law review article on the emerging technology. Ferguson said a police report is important in determining whether an officer's suspicion "justifies someone's loss of liberty". It's sometimes the only testimony a judge sees, especially for misdemeanour crimes. Human-generated police reports also have flaws, Ferguson said, but it's an open question as to which is more reliable. Concerns about society's racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist aurelius francisco finds "deeply troubling" about the new tool, which he learned about from the AP. He said automating those reports will "ease the police's ability to harass, surveil and inflict violence on community members. While making the cop's job easier, it makes Black and brown people's lives harder."
[2]
Police officers are starting to use AI chatbots to write crime reports. Will they hold up in court?
OKLAHOMA CITY -- A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9 dog, Gunner, searched for a group of suspects for nearly an hour. Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft. Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore's body camera, the AI tool churned out a report in eight seconds. "It was a better report than I could have ever written, and it was 100% accurate. It flowed better," Gilmore said. It even documented a fact he didn't remember hearing -- another officer's mention of the color of the car the suspects ran from. Oklahoma City's police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who've tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned. Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the dominant U.S. supplier of body cameras, it could become what Gilmore describes as another "game changer" for police work. "They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate," said Axon's founder and CEO Rick Smith, describing the new AI product -- called Draft One -- as having the "most positive reaction" of any product the company has introduced. "Now, there's certainly concerns," Smith added.
In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers -- not solely an AI chatbot -- are responsible for authoring their reports because they may have to testify in court about what they witnessed. "They never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't,'" Smith said. AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects' faces, detect gunshot sounds and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use. Concerns about society's racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist aurelius francisco finds "deeply troubling" about the new tool, which he learned about from The Associated Press. "The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough," said francisco, a co-founder of the Foundation for Liberating Minds in Oklahoma City. He said automating those reports will "ease the police's ability to harass, surveil and inflict violence on community members. While making the cop's job easier, it makes Black and brown people's lives harder." Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it's only used for minor incident reports that don't lead to someone getting arrested. "So no arrests, no felonies, no violent crimes," said Oklahoma City police Capt. Jason Bussert, who handles information technology for the 1,170-officer department. 
That's not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any kind of case and it's been "incredibly popular" since the pilot began earlier this year. Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they discovered it doesn't work well on patrols of the city's downtown bar district because of an "overwhelming amount of noise." Along with using AI to analyze and summarize the audio recording, Axon experimented with computer vision to summarize what's "seen" in the video footage, before quickly realizing that the technology was not ready. "Given all the sensitivities around policing, around race and other identities of people involved, that's an area where I think we're going to have to do some real work before we would introduce it," said Smith, the Axon CEO, describing some of the tested responses as not "overtly racist" but insensitive in other ways. Those experiments led Axon to focus squarely on audio in the product unveiled in April during its annual company conference for police officials. The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon's cloud computing provider. "We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have," said Noah Spitzer-Williams, who manages Axon's AI products. Turning down the "creativity dial" helps the model stick to facts so that it "doesn't embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own," he said. Axon won't say how many police departments are using the technology. It's not the only vendor, with startups like Policereports.ai and Truleo pitching similar products. 
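The "creativity dial" Spitzer-Williams describes corresponds to what generative AI interfaces commonly expose as a sampling temperature. The sketch below is a generic, hypothetical illustration of the concept only, not Axon's actual code: at a high temperature the model samples more freely across candidate words, while at temperature zero it always picks its single highest-scoring option, which is why a low setting curbs embellishment.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    """Pick a token index from raw model scores after temperature scaling.

    Lower temperature sharpens the probability distribution; at zero,
    sampling collapses to greedy decoding (always the top-scoring token).
    """
    if temperature <= 0:
        # Greedy decoding: fully deterministic, no randomness at all.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [score / temperature for score in logits]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]  # softmax, numerically stable
    total = sum(exps)
    r = rng.random()
    cumulative = 0.0
    for i, e in enumerate(exps):
        cumulative += e / total
        if r < cumulative:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]  # token 0 is the model's top choice
rng = random.Random(0)
# Temperature 0: every draw returns the top-scoring token.
greedy = [sample_with_temperature(logits, 0, rng) for _ in range(5)]
# High temperature: lower-ranked tokens get sampled more often.
creative = [sample_with_temperature(logits, 2.0, rng) for _ in range(5)]
```

Real products layer much more on top of this (prompting, guardrails, review steps), but the dial itself is just this trade-off between determinism and variety.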
But given Axon's deep relationship with police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more ubiquitous in the coming months and years. Before that happens, legal scholar Andrew Ferguson would like to see more of a public discussion about the benefits and potential harms. For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods into a police report. "I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing," said Ferguson, a law professor at American University working on what's expected to be the first law review article on the emerging technology. Ferguson said a police report is important in determining whether an officer's suspicion "justifies someone's loss of liberty." It's sometimes the only testimony a judge sees, especially for misdemeanor crimes. Human-generated police reports also have flaws, Ferguson said, but it's an open question as to which is more reliable. For some officers who've tried it, it is already changing how they respond to a reported crime. They're narrating what's happening so the camera better captures what they'd want to put in writing. As the technology catches on, Bussert expects officers will become "more and more verbal" in describing what's in front of them. After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera. 
"It was literally seconds," Gilmore said, "and it was done to the point where I was like, 'I don't have anything to change.'" At the end of the report, the officer must click a box indicating it was generated with the use of AI. The Associated Press and OpenAI have a licensing and technology agreement that allows OpenAI access to part of AP's text archives.
[5]
Cops are using AI to write reports, and it's changing how they respond to crimes
A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9 dog, Gunner, searched for a group of suspects for nearly an hour. Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft. Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilbert's body camera, the AI tool churned out a report in eight seconds. "It was a better report than I could have ever written, and it was 100% accurate. It flowed better," Gilbert said. It even documented a fact he didn't remember hearing -- another officer's mention of the color of the car the suspects ran from. Oklahoma City's police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who've tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned. Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the dominant U.S. supplier of body cameras, it could become what Gilbert describes as another "game changer" for police work. "They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate," said Axon's founder and CEO Rick Smith, describing the new AI product -- called Draft One -- as having the "most positive reaction" of any product the company has introduced. "Now, there's certainly concerns," Smith added. 
In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers -- not solely an AI chatbot -- are responsible for authoring their reports because they may have to testify in court about what they witnessed. "They never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't,'" Smith said. AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects' faces, detect gunshot soundsand predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use. Concerns about society's racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist aurelius francisco finds "deeply troubling" about the new tool, which he learned about from The Associated Press. "The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough," said francisco, a co-founder of the Foundation for Liberating Minds in Oklahoma City. He said automating those reports will "ease the police's ability to harass, surveil and inflict violence on community members. While making the cop's job easier, it makes Black and brown people's lives harder." Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it's only used for minor incident reports that don't lead to someone getting arrested. "So no arrests, no felonies, no violent crimes," said Oklahoma City police Capt. Jason Bussert, who handles information technology for the 1,170-officer department. 
That's not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any kind of case and it's been "incredibly popular" since the pilot began earlier this year. Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they discovered it doesn't work well on patrols of the city's downtown bar district because of an "overwhelming amount of noise." Along with using AI to analyze and summarize the audio recording, Axon experimented with computer vision to summarize what's "seen" in the video footage, before quickly realizing that the technology was not ready. "Given all the sensitivities around policing, around race and other identities of people involved, that's an area where I think we're going to have to do some real work before we would introduce it," said Smith, the Axon CEO, describing some of the tested responses as not "overtly racist" but insensitive in other ways. Those experiments led Axon to focus squarely on audio in the product unveiled in April during its annual company conference for police officials. The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon's cloud computing provider. "We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have," said Noah Spitzer-Williams, who manages Axon's AI products. Turning down the "creativity dial" helps the model stick to facts so that it "doesn't embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own," he said. Axon won't say how many police departments are using the technology. It's not the only vendor, with startups like Policereports.ai and Truleo pitching similar products. 
But given Axon's deep relationship with police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more ubiquitous in the coming months and years. Before that happens, legal scholar Andrew Ferguson would like to see more of a public discussion about the benefits and potential harms.

For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods into a police report. "I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing," said Ferguson, a law professor at American University working on what's expected to be the first law review article on the emerging technology.

Ferguson said a police report is important in determining whether an officer's suspicion "justifies someone's loss of liberty." It's sometimes the only testimony a judge sees, especially for misdemeanor crimes. Human-generated police reports also have flaws, Ferguson said, but it's an open question as to which is more reliable.

For some officers who've tried it, it is already changing how they respond to a reported crime. They're narrating what's happening so the camera better captures what they'd want to put in writing. As the technology catches on, Bussert expects officers will become "more and more verbal" in describing what's in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera.
"It was literally seconds," Gilmore said, "and it was done to the point where I was like, 'I don't have anything to change.'" At the end of the report, the officer must click a box that indicates it was generated with the use of AI.
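The "creativity dial" Spitzer-Williams mentions likely corresponds to what mainstream LLM APIs expose as a temperature parameter, which rescales the model's next-token scores before sampling. The sketch below is a generic illustration of that sampling mechanism, not Axon's implementation; the function name and the toy logit values are invented for the example.

```python
import math
import random

def sample_token(logits, temperature=1.0):
    """Sample a token index from raw scores; low temperature -> near-deterministic."""
    if temperature <= 1e-6:
        # Temperature ~0: always pick the highest-scoring token (no "creativity").
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale logits by 1/temperature, then apply a numerically stable softmax.
    scaled = [score / temperature for score in logits]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting distribution.
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(logits) - 1

logits = [2.0, 1.0, 0.5]  # toy next-token scores
sample_token(logits, temperature=0.0)  # always returns index 0
```

At a temperature near zero the highest-scoring token is always chosen, which is why a turned-down dial tends to produce more literal, less embellished text.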
[6]
Police officers are starting to use AI chatbots to write crime reports. Will they hold up in court?
OKLAHOMA CITY (AP) -- A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9 dog, Gunner, searched for a group of suspects for nearly an hour. Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.

Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore's body camera, the AI tool churned out a report in eight seconds. "It was a better report than I could have ever written, and it was 100% accurate. It flowed better," Gilmore said. It even documented a fact he didn't remember hearing -- another officer's mention of the color of the car the suspects ran from.

Oklahoma City's police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who've tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned.

Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser and as the dominant U.S. supplier of body cameras, it could become what Gilmore describes as another "game changer" for police work. "They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate," said Axon's founder and CEO Rick Smith, describing the new AI product -- called Draft One -- as having the "most positive reaction" of any product the company has introduced. "Now, there's certainly concerns," Smith added.
In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers -- not solely an AI chatbot -- are responsible for authoring their reports because they may have to testify in court about what they witnessed. "They never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't,'" Smith said.

AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects' faces, detect gunshot sounds and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use.

Concerns about society's racial biases and prejudices getting built into AI technology are just part of what Oklahoma City community activist aurelius francisco finds "deeply troubling" about the new tool, which he learned about from The Associated Press. "The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough," said francisco, a co-founder of the Foundation for Liberating Minds in Oklahoma City. He said automating those reports will "ease the police's ability to harass, surveil and inflict violence on community members. While making the cop's job easier, it makes Black and brown people's lives harder."

Before trying out the tool in Oklahoma City, police officials showed it to local prosecutors who advised some caution before using it on high-stakes criminal cases. For now, it's only used for minor incident reports that don't lead to someone getting arrested. "So no arrests, no felonies, no violent crimes," said Oklahoma City police Capt. Jason Bussert, who handles information technology for the 1,170-officer department.

That's not the case in another city, Lafayette, Indiana, where Police Chief Scott Galloway told the AP that all of his officers can use Draft One on any kind of case and it's been "incredibly popular" since the pilot began earlier this year. Or in Fort Collins, Colorado, where police Sgt. Robert Younger said officers are free to use it on any type of report, though they discovered it doesn't work well on patrols of the city's downtown bar district because of an "overwhelming amount of noise."

Along with using AI to analyze and summarize the audio recording, Axon experimented with computer vision to summarize what's "seen" in the video footage, before quickly realizing that the technology was not ready. "Given all the sensitivities around policing, around race and other identities of people involved, that's an area where I think we're going to have to do some real work before we would introduce it," said Smith, the Axon CEO, describing some of the tested responses as not "overtly racist" but insensitive in other ways.

Those experiments led Axon to focus squarely on audio in the product unveiled in April during its annual company conference for police officials. The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon's cloud computing provider.

"We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have," said Noah Spitzer-Williams, who manages Axon's AI products. Turning down the "creativity dial" helps the model stick to facts so that it "doesn't embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own," he said.

Axon won't say how many police departments are using the technology. It's not the only vendor, with startups like Policereports.ai and Truleo pitching similar products.

But given Axon's deep relationship with police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more ubiquitous in the coming months and years. Before that happens, legal scholar Andrew Ferguson would like to see more of a public discussion about the benefits and potential harms.

For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods into a police report. "I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing," said Ferguson, a law professor at American University working on what's expected to be the first law review article on the emerging technology.

Ferguson said a police report is important in determining whether an officer's suspicion "justifies someone's loss of liberty." It's sometimes the only testimony a judge sees, especially for misdemeanor crimes. Human-generated police reports also have flaws, Ferguson said, but it's an open question as to which is more reliable.

For some officers who've tried it, it is already changing how they respond to a reported crime. They're narrating what's happening so the camera better captures what they'd want to put in writing. As the technology catches on, Bussert expects officers will become "more and more verbal" in describing what's in front of them.

After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera.
"It was literally seconds," Gilmore said, "and it was done to the point where I was like, 'I don't have anything to change.'" At the end of the report, the officer must click a box that indicates it was generated with the use of AI. The Associated Press and OpenAI have a licensing and technology agreement that allows OpenAI access to part of AP's text archives.
[10]
Cops are using AI software to write police reports
Axon says it adjusted its LLM's 'creativity dial' to reduce hallucinations.

Police departments are often some of the tech industry's earliest adopters of new products like drones, facial recognition, predictive software and now artificial intelligence. After already embracing AI audio transcription programs, some departments are now testing a new, more comprehensive tool -- software that leverages technology similar to ChatGPT to auto-generate police reports. According to an August 26 report from the Associated Press, many officers are already "enthused" by the generative AI tool, which claims to shave 30-45 minutes from routine office work.

Initially announced in April by Axon, Draft One is billed as the "latest giant leap toward [the] moonshot goal to reduce gun-related deaths between police and the public." The company -- best known for Tasers and law enforcement's most popular lines of body cams -- claims its initial trials cut an hour of paperwork per day for users. "When officers can spend more time connecting with the community and taking care of themselves both physically and mentally, they can make better decisions that lead to more successful de-escalated outcomes," Axon said in its reveal.

The company stated at the time that Draft One is built with Microsoft's Azure OpenAI platform, and automatically transcribes police body camera audio before "leveraging AI to create a draft narrative quickly." Reports are "drafted strictly from the audio transcript" following Draft One's "underlying model... to prevent speculation or embellishments." After additional key information is added, officers must sign off on a report's accuracy before it is submitted for another round of human review. Each report is also flagged if AI was involved in writing it.

Speaking with AP on Monday, Axon's AI products manager, Noah Spitzer-Williams, claims Draft One uses the "same underlying technology as ChatGPT."
Designed by OpenAI, ChatGPT's baseline generative large language model has been frequently criticized for its tendency to provide misleading or false information in its responses. Spitzer-Williams, however, likens Axon's abilities to having "access to more knobs and dials" than are available to casual ChatGPT users. Adjusting its "creativity dial" allegedly helps Draft One keep its police reports factual and avoid generative AI's ongoing hallucination issues.

Draft One's scope currently appears to vary by department. Oklahoma City police Capt. Jason Bussert claimed his 1,170-officer department currently only uses Draft One for "minor incident reports" that don't involve arrests. But in Lafayette, Indiana, AP reports the police who serve the town's nearly 71,000 residents have free rein to use Draft One "on any kind of case."

Faculty at Lafayette's neighboring Purdue University, meanwhile, argue generative AI simply isn't reliable enough to handle potentially life-altering situations such as run-ins with the police. "The large language models underpinning tools like ChatGPT are not designed to generate truth. Rather, they string together plausible sounding sentences based on prediction algorithms," says Lindsey Weinberg, a Purdue clinical associate professor focusing on digital and technological ethics, in a statement to Popular Science.

"The use of tools that make it 'easier' to generate police reports in the context of a legal system that currently supports and sanctions the mass incarceration of [marginalized populations] should be deeply concerning to those who care about privacy, civil rights, and justice," Weinberg says.

In an email to Popular Science, an OpenAI representative suggested inquiries be directed to Microsoft. Axon, Microsoft, and the Lafayette Police Department did not respond to requests for comment at the time of writing.
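The "knobs and dials" Spitzer-Williams describes likely include the sampling temperature that OpenAI-style chat APIs expose; setting it to zero makes the model pick the most probable next token rather than sampling, which is one way to curb embellishment. A minimal sketch of how a transcript-to-draft request could be assembled under that assumption (the prompt wording, model name, and helper function are illustrative, not Axon's actual configuration):

```python
# Hypothetical sketch of a transcript-to-report request in the style of an
# OpenAI chat-completions payload. Prompt text and model name are assumptions
# for illustration, not Axon's real Draft One setup.

SYSTEM_PROMPT = (
    "Draft a police incident report strictly from the transcript provided. "
    "Do not speculate or add details that are not in the transcript."
)

def build_draft_request(transcript: str, model: str = "gpt-4o") -> dict:
    """Assemble a chat-completion payload with the 'creativity dial' turned down."""
    return {
        "model": model,
        # temperature=0 is the dial at its lowest setting: the model always
        # chooses the most likely token instead of sampling more freely.
        "temperature": 0,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": transcript},
        ],
    }

payload = build_draft_request("Dispatch: report of a traffic stop at 14:02 ...")
print(payload["temperature"])  # prints 0
```

In a real deployment the payload would go to the provider's chat-completions endpoint, and the returned draft would still need the officer's review and sign-off, as the article describes.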
[11]
Police Officers Are Starting To Use AI To Write Crime Reports. Will They Hold Up In Court?
"They never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't,'" Smith said. AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects' faces, detect gunshot sounds and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use.
Some US police departments are experimenting with AI chatbots to write crime reports, aiming to save time and improve efficiency. However, this practice has sparked debates about accuracy, racial bias, and the potential impact on the justice system.
In a significant technological shift, some US police departments have begun utilizing AI chatbots to assist in writing crime reports. This move, aimed at streamlining administrative tasks, has garnered attention from both law enforcement agencies and civil rights advocates [1].

Proponents argue that AI-assisted report writing could save officers valuable time, allowing them to spend more hours on patrol and community engagement. Axon, a major supplier of body cameras and Tasers, has introduced a product that uses AI to draft reports based on video footage and voice recordings [2].

However, the adoption of AI in policing has raised significant concerns. Critics worry about the potential for racial bias, inaccuracies, and the impact on the justice system. Civil liberties groups have pointed out that AI systems can perpetuate existing biases in policing, potentially exacerbating racial disparities [3].

Despite concerns, some departments are moving forward with AI implementation. In Alameda, California, police have been testing AI software to generate reports from body-camera video. The department emphasizes that officers review and edit AI-generated text before finalizing reports [4].

The use of AI in policing extends beyond report writing. Some agencies are exploring AI for emergency call responses and other administrative tasks. This trend reflects a broader push towards technological solutions in law enforcement, aimed at addressing staffing shortages and improving efficiency [5].
As police departments continue to explore AI applications, the challenge lies in balancing technological innovation with accountability and fairness. Experts stress the need for robust oversight, regular audits of AI systems, and maintaining human judgment in critical decision-making processes.
The integration of AI in police work is likely to continue evolving. As departments experiment with these tools, ongoing discussions about ethics, privacy, and the role of technology in law enforcement will shape policies and practices. The outcome of these early adoptions may set precedents for the future of AI in policing across the United States.