Curated by THEOUTPOST
On Wed, 7 May, 8:03 AM UTC
44 Sources
[1]
This man was killed four years ago. His AI clone just spoke in court.
People just can't stop using generative AI tools in legal proceedings, despite repeated pushback from frustrated judges. While AI initially appeared in courtrooms through bogus "hallucinated" cases, the trend has taken a turn -- driven by increasingly sophisticated AI video and audio tools. In some instances, AI is even being used to seemingly bring victims back from the dead. This week, a crime victim's family presented a brief video in an Arizona courtroom depicting an AI version of 37-year-old Chris Pelkey. Pelkey was shot and killed in 2021 in a road rage incident. Now, four years later, the AI-generated "clone" appeared to address his alleged killer in court. The video, first reported by local outlet ABC15, appears to be the first known example of a generative AI deepfake used in a victim impact statement. "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," the AI replica of Pelkey says in the video. "In another life, we probably could have been friends." The video shows the AI version of Pelkey -- a burly, bearded Army veteran -- wearing a green hoodie and gray baseball cap. Pelkey's family reportedly created the video by training an AI model on various clips of Pelkey. An "old age" filter was then applied to simulate what Pelkey might look like today. In the end, the judge sentenced Horcasitas to 10.5 years in prison for manslaughter, a decision he said was at least partly influenced by the AI-generated impact statement. "This is the best I can ever give you of what I would have looked like if I got the chance to grow old," the Pelkey deepfake said. "Remember, getting old is a gift that not everybody has, so embrace it and stop worrying about those wrinkles." The AI-generated impact statement comes just a month after a defendant in New York State court, 74-year-old Jerome Dewald, used a deepfake video to assist in delivering his own legal defense. 
When Dewald appeared in court over a contract dispute with a former employer, he presented a video showing a man in a sweater and blue dress shirt speaking directly to the camera. The judge, confused by the video, asked Dewald if the person on screen was his attorney. In reality, it was an AI-generated deepfake. "I generated that," Dewald said, according to The New York Times. "That is not a real person." The judge wasn't pleased and reprimanded Dewald for failing to disclose that he had used AI software to aid his defense. Speaking with the NYT after the hearing, Dewald claimed he hadn't intended to mislead the court but used the AI tool as a way to more clearly articulate his defense. He said he initially planned to have the deepfake resemble himself but switched to the version shown in court after encountering technical difficulties. "My intent was never to deceive but rather to present my arguments in the most efficient manner possible," Dewald reportedly said in a letter to the judges. The two cases represent the latest examples of generative AI seeping into courtrooms, a trend that began gaining traction several years ago following the surge of public interest in popular chatbots like OpenAI's ChatGPT. Lawyers across the country have reportedly used these large language models to help draft legal filings and collect information. That has led to some embarrassing instances where models have "hallucinated" entirely fabricated case names and facts that eventually make their way into legal proceedings. In 2023, two New York-based lawyers were sanctioned by a judge after they submitted a brief containing six fake case citations generated by ChatGPT. Michael Cohen, the former personal lawyer of President Donald Trump, reportedly sent fake AI-generated legal cases to his attorney that ended up in a motion submitted to federal judges. 
Another lawyer in Colorado was suspended after reportedly submitting AI-generated legal cases. OpenAI has even been sued by a Georgia radio host who claimed a ChatGPT response accused him of being involved in a real embezzlement case he had nothing to do with. Though courts have punished attorneys and defendants for using AI in ways that appear deceptive, the rules around whether it's ever acceptable to use these tools remain murky. Just last week, a federal judicial panel voted 8-1 to seek public comment on a draft rule aimed at ensuring that AI-assisted evidence meets the same standards as evidence presented by human expert witnesses. Supreme Court Chief Justice John Roberts also addressed the issue in his 2023 annual report, noting both the potential benefits and drawbacks of allowing more generative AI in the courtroom. On one hand, he observed, AI could make it easier for people with limited financial resources to defend themselves. At the same time, he warned that the technology risks "invading privacy interests and dehumanizing the law." One thing seems certain: We haven't seen the last of AI deepfakes in courtrooms.
[2]
Family creates AI video to depict Arizona man addressing his killer in court
CHANDLER, ARIZONA, May 9 (Reuters) - A simulation of a dead man created by artificial intelligence addressed his killer in an Arizona court this month, in what appears to be one of the first such instances in a U.S. courtroom. Made by his family, an AI-generated avatar of Christopher Pelkey spoke in Maricopa County Superior Court on May 1, as a judge prepared to sentence Gabriel Paul Horcasitas for shooting and killing Pelkey in a 2021 road-rage incident. "It is a shame we encountered each other that day in those circumstances," the Pelkey avatar says in the video. "In another life, we probably could have been friends." The Pelkey avatar appears in the video sporting a long beard and green sweatshirt against a white backdrop. He cautions at the start that he is an AI version of Pelkey, which is apparent through the gaps in audio and slightly mismatched movement of his mouth. Pelkey, a U.S. Army veteran, was 37 at the time of the shooting. The video marked a novel use of AI in the legal system, which has viewed the rapidly growing technology with a mix of fascination and trepidation. Courts generally have strict rules on the types of information that can be presented in legal proceedings, and several lawyers have been sanctioned after AI systems created fake cases that they cited in legal briefs. Pelkey's relatives were given more leeway to present the AI-generated video to the judge at sentencing, given that it was not evidence in the case. Horcasitas, who was sentenced to 10.5 years in state prison, had already been convicted on manslaughter and endangerment charges. Pelkey's sister Stacey Wales said she scripted the AI-generated message after struggling to convey years of grief and pain in her own statement. She said she was not ready to forgive Horcasitas, but felt her brother would have a more understanding outlook. "The goal was to humanize Chris, to reach the judge, and let him know his impact on this world and that he existed," she told Reuters. 
Generative AI, Wales said, is "just another avenue that you can use to reach somebody." Wales said she worked with her husband and a family friend, who all work in the tech industry, to create it. Harry Surden, a law professor at the University of Colorado, said the use of generative AI material in court raises ethical concerns, as others may seek to use those tools to play on the emotions of judges and juries. The content is a simulation of reality, not the verified evidence that courts typically assess, Surden said. "What we're seeing is the simulations have gotten so good that it completely bypasses our natural skepticism and goes straight to our emotion," he said. Reporting by Liliana Salgado in Chandler, Arizona and Andrew Goudsward in Washington; editing by Andy Sullivan and Aurora Ellis
[3]
He was killed in a road rage shooting. AI allowed him to deliver his own victim impact statement
CHANDLER, Ariz. (AP) -- There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself -- in this case, an AI-generated version. In what's believed to be a first in U.S. courts, Pelkey's family used artificial intelligence to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week that it was a shame they had to meet that day in 2021 under those circumstances -- and that the two of them probably could have been friends in another life. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to share advice for people to make the most of each day and to love each other, not knowing how much time one might have left. While use of artificial intelligence within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. Using AI to generate victim impact statements marks a new -- and legal, at least in Arizona -- tool for sharing information with the court outside the evidentiary phases. Maricopa County Superior Court Judge Todd Lang, who presided over the road rage case, said after watching the video that he imagined Pelkey, who was 37 at the time of his killing, would have felt that way after learning about him. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. Horcasitas was convicted of manslaughter and sentenced to 10.5 years in prison. 
"Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang said. The Associated Press left phone and emailed messages Wednesday seeking comment from Horcasitas' lawyer. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself. For years, while the case worked its way through the legal system, Wales said she thought about what she would say at the sentencing hearing. She struggled to get words down on paper. But when she thought about what her brother would say to the shooter, knowing he would have forgiven him, the words poured out of her. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Arizona Supreme Court Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility to AI in recent years led to the formation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms. "There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," he said. "It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see." Marchant pointed to a recent case in New York, where a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges on the appeals court to realize that the man addressing them from the video screen didn't exist at all. 
In the Arizona case, Wales said the AI-generated video worked because the judge had nearly 50 letters from family and friends that echoed the video's message. "There was a solid gold thread through all of those stories -- that was the heart of Chris," Wales said. "This works because it talks about the kind of person Chris was." ___ Yamat reported from Las Vegas. Associated Press reporter Susan Montoya Bryan in Albuquerque, New Mexico, contributed to this report.
[4]
From AI avatars to virtual reality crime scenes, courts are grappling with AI in the justice system
Stacey Wales gripped the lectern, choking back tears as she asked the judge to give the man who shot and killed her brother the maximum possible sentence for manslaughter. What appeared next stunned those in the Phoenix courtroom last week: An AI-generated video with a likeness of her brother, Christopher Pelkey, told the shooter he was forgiven. The judge said he loved and appreciated the video, then sentenced the shooter to 10.5 years in prison -- the maximum sentence and more than what prosecutors sought. Within hours of the hearing on May 1, the defendant's lawyer filed a notice of appeal. Defense attorney Jason Lamm won't be handling the appeal, but said a higher court will likely be asked to weigh in on whether the judge improperly relied on the AI-generated video when sentencing his client. Courts across the country have been grappling with how to best handle the increasing presence of artificial intelligence in the courtroom. Even before Pelkey's family used AI to give him a voice for the victim impact portion -- believed to be a first in U.S. courts -- the Arizona Supreme Court created a committee that researches best AI practices. In Florida, a judge recently donned a virtual reality headset meant to show the point of view of a defendant who said he was acting in self-defense when he waved a loaded gun at wedding guests. The judge rejected his claim. And in New York, a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen wasn't real. Experts say using AI in courtrooms raises legal and ethical concerns, especially if it's used effectively to sway a judge or jury. And they argue it could have a disproportionate impact on marginalized communities facing prosecution. 
"I imagine that will be a contested form of evidence, in part because it could be something that advantages parties that have more resources over parties that don't," said David Evan Harris, an expert on AI deep fakes at UC Berkeley's business school. AI can be very persuasive, Harris said, and scholars are studying the intersection of the technology and manipulation tactics. Cynthia Godsoe, a law professor at Brooklyn Law School and a former public defender, said as this technology continues to push the boundaries of traditional legal practices, courts will have to confront questions they have never before had to weigh: Does this AI photograph really match the witness's testimony? Does this video exaggerate the suspect's height, weight, or skin color? "It's definitely a disturbing trend," she said, "because it could veer even more into fake evidence that maybe people don't figure out is false." In the Arizona case, the victim's sister told The Associated Press that she did consider the "ethics and morals" of writing a script and using her brother's likeness to give him a voice during the sentencing hearing. "It was important to us to approach this with ethics and morals and to not use it to say things that Chris wouldn't say or believe," Stacey Wales said. Victims can give their impact statements in any digital format in Arizona, said victims' rights attorney Jessica Gattuso, who represented the family. When the video played in the courtroom, Wales said only she and her husband knew about it. "The goal was to humanize Chris and to reach the judge," Wales said. After viewing it, Maricopa County Superior Court Judge Todd Lang said he "loved the beauty in what Christopher" said in the AI video. "It also says something about the family," he said. "Because you told me how angry you were, and you demanded the maximum sentence, and even though that's what you wanted, you allowed Chris to speak from his heart as you saw it." 
On appeal, the defendant's lawyer said, the judge's comments could be a factor for the sentence to be overturned. ___ Associated Press reporters Sarah Parvini in Los Angeles, Sejal Govindarao in Phoenix and Kate Payne in Tallahassee, Florida, contributed to this report.
[5]
Deepfake of deceased man gives his own impact statement in court
The defense says the use of AI creates a strong case for appeal. The AI-generated deepfake of a deceased road rage victim gave his own impact statement in court at the sentencing hearing of the defendant. This is likely the first time the technology has been used in this way. The idea of using an AI version of the victim, Christopher Pelkey, came from his family, according to a Maricopa County Attorney's Office spokesperson. Pelkey's sister said she had been writing the impact statement for two years but found that what she had to say "did not seem like it would do justice" to his memory. Pelkey was shot and killed in 2021 during a road rage incident. So the idea of bringing in a deepfake avatar was born. Pelkey's sister wrote the script, telling CNN that she was sure "it's what he would think." Maricopa County Superior Court Judge Todd Lang approved the idea and the family played a video of the AI-generated Pelkey in court. In the video, the avatar actually seemed to ask for leniency when sentencing his killer. The defendant was convicted of manslaughter and endangerment earlier this year. "To Gabriel Horcasitas, the man who shot me: It is a shame we encountered each other that day in those circumstances," the artificial version of Pelkey said. "In another life, we probably could have been friends. I believe in forgiveness." However, the judge issued the maximum sentence of over 10 years in prison. "I heard the forgiveness," he said about the AI-generated avatar. "I feel like that was genuine, that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about [Pelkey] today." The defense has stated that the AI presentation creates a strong issue for appeal. "While judges certainly have latitude as to what to hear, particularly from victims, an appellate court will have to decide if this was error," defense lawyer Jason Lamm said. The case has already been retried for procedural issues. 
Arizona State University law professor Gary Marchant, who specializes in ethics and emerging technologies, is worried about the legal precedent set here. "You see that person in the courtroom actually speaking, and in reality, they're dead and they're not speaking," he told NBC News. "So this is an extra jump that I feel is going to get us into dangerous grounds."
[6]
Reincarnated by A.I., Arizona Man Forgives His Killer at Sentencing
A likeness of Christopher Pelkey, who was killed in a 2021 road rage episode, was created with artificial intelligence. It was part of a victim's impact statement. The letters came streaming in: from battalion brothers who had served alongside Christopher Pelkey in Iraq and Afghanistan, fellow missionaries and even a prom date. A niece and nephew addressed the court. Still, the voice that mattered most to Mr. Pelkey's older sister, Stacey Wales, would most likely never be heard when it was time for an Arizona judge to sentence the man who killed her brother during a 2021 road rage episode -- the victim's. Ms. Wales, 47, had a thought. What if her brother, who was 37 and had done three combat tours of duty in the U.S. Army, could speak for himself at the sentencing? And what would he tell Gabriel Horcasitas, 54, the man convicted of manslaughter in his case? The answer came on May 1, when Ms. Wales clicked the play button on a laptop in a courtroom in Maricopa County, Ariz. A likeness of her brother appeared on an 80-inch television screen, the same one that had previously displayed autopsy photos of Mr. Pelkey and security camera footage of his being fatally shot at an intersection in Chandler, Ariz. It was created with artificial intelligence. "It is a shame we encountered each other that day in those circumstances," the avatar of Mr. Pelkey said. "In another life, we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have and I still do." While the use of A.I. has spread through society, from the written word to memes and deepfakes, its use during the sentencing of Mr. Horcasitas, who got the maximum 10 and a half years in prison, appeared to be uncharted. It reverberated far beyond the courtroom, drawing headlines, questions and debate. Critics argued that the introduction of A.I. 
in legal proceedings could open the door to manipulation and deception, compounding an already emotional process of giving victim impact statements. One thing was certain: The nearly four-minute video made a favorable impression on the judge, Todd Lang, of the Maricopa County Superior Court, who complimented its inclusion moments before sentencing Mr. Horcasitas. "I loved that A.I.," Judge Lang said, describing the video's message as genuine. "Thank you for that. And as angry as you are, and justifiably angry as the family is, I heard the forgiveness. And I know Mr. Horcasitas appreciated it, but so did I." Much in the same way that social media apps have been placing labels on A.I.-generated content, the video opened with a disclaimer. "Hello, just to be clear, for everyone seeing this, I am a version of Chris Pelkey recreated through A.I. that uses my picture and my voice profile," it said. "I was able to be digitally regenerated to share with you today." While many states provide an opportunity for victims and their families to address the court during sentencings, some are more restrictive in the use of video presentations and photographs, according to legal experts. But victims have broader latitude in Arizona. Ms. Wales said in an interview on Wednesday that she had discovered that fact as she bounced the idea of using A.I. off a victims' rights lawyer who represented Mr. Pelkey's family. "She says, 'I don't think that's ever been done before,'" Ms. Wales said. Ms. Wales had been preparing her victim's impact statement for two years, she said, but it was missing a critical element. "I kept hearing what Chris would say," she said. Ms. Wales said that she then enlisted the help of her husband and their longtime business partner, who had used A.I. to help corporate clients with presentations, including one featuring a likeness of a company's chief executive who had died years ago. They took Mr. 
Pelkey's voice from a YouTube video that they had found of him speaking after completing treatment for PTSD at a facility for veterans, she said. For his face and torso, they used a poster of Mr. Pelkey from a funeral service, digitally trimming his thick beard, removing his glasses and editing out a logo from his cap, she said. Ms. Wales said that she had written the script that was read by the A.I. likeness of her brother. "I know that A.I. can be used nefariously, and it's uncomfortable for some," Ms. Wales said. "But this was just another tool to use to tell Chris's story." Vanessa Ceja-Cervantes, a spokeswoman for the Maricopa County attorney, said in an email that the office was not aware of A.I. being used before to give a victim's impact statement. Jason D. Lamm, a defense lawyer for Mr. Horcasitas, said in an interview that it would have been difficult to block the video from being shown. "Victims generally have extremely broad latitude to make their voices heard at sentencing, and the rules of evidence don't apply at sentencing," Mr. Lamm said. "However this may be a situation where they just took it too far, and an appellate court may well determine that the court's reliance on the A.I. video could constitute reversible error and require a resentencing." Ms. Wales emphasized that the video of her brother's likeness was used during only the sentencing phase of the case, not in either of Mr. Horcasitas's two trials. Both ended with convictions. He was granted a second trial because prosecutors did not disclose certain evidence during the first, according to court records. On Nov. 13, 2021, Mr. Pelkey was stopped at a red light in Chandler when Mr. Horcasitas pulled up behind him and honked at him, prompting Mr. Pelkey to exit his vehicle and approach Mr. Horcasitas's Volkswagen and gesture with his arms as if to say "what the heck," according to a probable cause statement. Mr. Horcasitas then fired a gun at him, hitting Mr. 
Pelkey at least once in the chest. Cynthia Godsoe, a professor at Brooklyn Law School and a former public defender who helps write best practices for lawyers for the American Bar Association, said in an interview on Thursday that she was troubled by the allowance of A.I. at the sentencing. "It's clearly going to inflame emotions more than pictures," Ms. Godsoe said. "I think courts have to be really careful. Things can be altered. We know that. It's such a slippery slope." In the U.S. federal courts, a rule-making committee is currently considering evidentiary standards for A.I. materials when parties in cases agree that it is artificially generated, said Maura R. Grossman, a lawyer from Buffalo, who is on the American Bar Association's A.I. task force. Ms. Grossman, a professor at the School of Computer Science at the University of Waterloo, who also teaches at the Osgoode Hall Law School, both in Canada, did not object to the use of A.I. in the Arizona sentencing. "There's no jury that can be unduly influenced," Ms. Grossman said. "I didn't find it ethically or legally troubling." Then there was the curious case of the plaintiff in a recent New York State legal appeal who made headlines when he tried to use an A.I. avatar to argue his case. "The appellate court shut him down," Ms. Grossman said.
[7]
AI lets murder victim address his killer in Arizona courtroom
Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust. What just happened? We've seen stories in the past of people using artificial intelligence to have conversations with deceased loved ones - or at least the system's interpretation of their personality. Now, AI technology has been used so a man who was murdered in a road rage incident in 2021 could address his killer in court. 37-year-old army veteran Christopher Pelkey was killed by Gabriel Horcasitas at a red light in 2021 in Chandler, Arizona. Pelkey had left his vehicle and was walking back toward Horcasitas' car when he was shot. In what is believed to be the first use of AI to deliver a victim statement, a lifelike simulacrum of the deeply religious Pelkey addressed the man who killed him in an Arizona court. "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," said Pelkey. "In another life, we probably could have been friends." "I believe in forgiveness, and a God who forgives. I always have, and I still do." Stacey Wales, Pelkey's sister, came up with the idea to use AI in this way as she collected victim impact statements and prepared her own. "We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," she said. "All I kept coming back to was, what would Chris say?" Wales said. [Image: Wales poses with the photo of her brother on which the AI-generated video is based. Credit: Fox 10] Unlike other instances of generative AI being used to speak to deceased individuals, Wales wrote the script that her brother delivered. The technology was used to create a video of an older version of Pelkey, based on a photograph provided by the family, and put the words into his mouth, making this more like a deepfake - albeit one created for a good cause. 
This was one of the rare cases where a judge welcomed the use of AI in a courtroom. Judge Todd Lang said, "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness." Pelkey's brother John was equally pleased, saying that seeing his brother's face made him feel "waves of healing." Lang sentenced Horcasitas to 10-and-a-half years in prison on manslaughter charges. Most of the instances of AI being used in courtrooms haven't gone well. Back in 2023, what was set to be the first case of an AI "robot lawyer" used in a court of law never materialized after the CEO behind it was threatened with jail time. There have also been several instances of human lawyers using generative AI to file briefs containing nonexistent cases. A case this year led to a $15,000 fine for the lawyer involved. In June 2023, two lawyers and their law firm were fined $5,000 by a district judge in Manhattan for citing fake legal research generated by ChatGPT.
[8]
AI version of dead Arizona man addresses killer during sentencing
While some experts argue the unique use of AI is just another step into the future, others say it could become a slippery slope for using the technology in legal cases. His family used voice recordings, videos and pictures of Mr Pelkey, who was 37 when he was killed, to recreate him in a video using AI, his sister Stacey Wales told the BBC. Ms Wales said she wrote the words that the AI version read in court based on how forgiving she knew her brother to be. "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," said the AI version of Mr Pelkey in court. "In another life, we probably could have been friends." "I believe in forgiveness, and a God who forgives. I always have and I still do," the AI version of Mr Pelkey - wearing a grey baseball cap - continues. The technology was used at his killer's sentencing - Horcasitas already had been found guilty by a jury - some four years after Horcasitas shot Mr Pelkey at a red light in Arizona. The Arizona judge who oversaw the case, Todd Lang, seemed to appreciate the use of AI at the hearing. He sentenced Horcasitas to 10-and-a-half years in prison on manslaughter charges. "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness," Judge Lang said. "I feel that that was genuine." Paul Grimm, a retired federal judge and Duke Law School professor, told the BBC he was not surprised to see AI used in the Horcasitas sentencing. Arizona courts, he notes, already have started using AI in other ways. When the state's Supreme Court issues a ruling, for example, it has an AI system that makes those rulings digestible for people. And Mr Grimm said because it was used without a jury present, just for a judge to decide sentencing, the technology was allowed. "We'll be leaning on [AI] on a case-by-case basis, but the technology is irresistible," he said. 
But some experts like Derek Leben, a business ethics professor at Carnegie Mellon University, are concerned about the use of AI and the precedent this case sets. While Mr Leben does not question this family's intention or actions, he worries not all uses of AI will be consistent with a victim's wishes. "If we have other people doing this moving forward, are we always going to get fidelity to what the person, the victim in this case, would've wanted?" Mr Leben asked. For Ms Wales, however, this gave her brother the final word. "We approached this with ethics and morals because this is a powerful tool. Just like a hammer can be used to break a window or rip down a wall, it can also be used as a tool to build a house and that's how we used this technology," she said.
[9]
Sister creates AI video of slain brother to address his killer in court
Stacey Wales played the AI-generated video of her brother as a victim impact statement during the sentencing of his killer in an Arizona court. Stacey Wales had a daunting task ahead of her: preparing a victim impact statement for the sentencing of the man who had fatally shot her brother in a road-rage incident in 2021. She wondered how to convey the weight of her loss. "The victims' attorney said to us, 'Try to bring him to life,'" Wales said. So Wales turned to artificial intelligence. At the May 1 court hearing in Arizona, she played a video of her brother, Christopher Pelkey. "Just to be clear for everyone seeing this," the avatar of Pelkey said. "I am a version of Chris Pelkey re-created through AI." The facsimile of Pelkey thanked the judge and told his killer he believed in forgiveness, saying that "in another life, we probably could have been friends." He ended the video with a farewell to his family: "Well, I'm going to go fishing now." It wasn't a perfect likeness of Pelkey. His face moved stiffly, and his voice was clipped. But the video moved his family and friends and stirred the judge, who said he "loved that AI" in his closing remarks. "I feel that that was genuine," said Todd Lang, the Maricopa County Superior Court judge who ruled in the case. He sentenced Pelkey's killer to 10 and a half years in prison, the maximum for manslaughter -- which Wales had asked for. Wales's video joins a growing list of cases in which parties have brought generative artificial intelligence into the courtroom. Experts said the AI footage of Pelkey was striking for its novelty -- and for how well it was received. "This definitely caught a number of us by surprise," said Diana Bowman, a law professor at Arizona State University. Pelkey was killed in a road rage incident in Chandler, Arizona, in November 2021, court records show. While stopped at a red light, Pelkey left his car and approached another car whose driver had honked repeatedly at him.
That driver, Gabriel Horcasitas, shot and killed him as he approached. A jury convicted Horcasitas of manslaughter in March. As his sentencing approached, Wales contacted Pelkey's friends and family and gathered dozens of written statements, video clips and photos to show the judge. Then she thought that she could do more. "I said to myself, 'Well, what if Chris could make his own impact statement?'" Wales said. Wales's husband, Tim Wales, a tech entrepreneur, had experience using generative AI to animate photos and replicate voices. She proposed creating a video of Pelkey. "I won't let it [be published] if it's hokey or flat," Stacey Wales recalled reassuring him at the time. Tim Wales and a friend used AI tools to edit a photo of Pelkey, clone his voice based on old videos of him speaking, and animate his face so his eyes blinked and his mouth moved as he spoke. Wales wrote Pelkey's speech herself -- by hand and without AI, she said -- based on what she thought her brother would say. Wales wanted the toughest sentence allowable for Horcasitas, she said, but she wrote in Pelkey's voice that he "believed in forgiveness and God who forgives." Then she showed her victims' attorney, Jessica Gattuso. "I thought it was very effective," Gattuso said. "It was appropriate. I didn't know what kind of objections we might get or pushback. ... I did kind of prepare for that." But no one objected when Wales played the video in court after dozens of other friends and family members gave their own tributes to Pelkey. Wales kept the video a surprise even to her family. She also did not disclose it to the judge or Horcasitas's attorneys; Arizona law does not require that, Gattuso said. The video appeared to resonate with Lang, who praised it before delivering Horcasitas's sentence. Lang requested a copy of the video to show his peers a few days after the hearing, Wales and Gattuso said.
Wales fared better in bringing AI-generated video into the courtroom than others who did so in different contexts. A New York man was scolded for using an AI avatar to represent him in an employment dispute in March. A Washington state judge rejected bystander video submitted as evidence in a triple murder case last year because it was enhanced with AI tools. Bowman, the law professor, said Wales's case probably avoided controversy because the video was introduced during a sentencing and wasn't being used to determine the defendant's guilt. It also helped that Wales, unlike the New York man, clearly introduced her video as AI-generated. Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State, said attorneys might have objected to showing a video that fabricates a victim's voice to a jury. "In most cases, it's going to be possibly misleading and prejudicial, probably," Marchant said. "So I think it's dangerous to start using non-real evidence that is created by an AI, even though, in this particular case, I'm kind of sympathetic to it." Arizona's highest court is open to bringing AI into the legal process, state Supreme Court Chief Justice Ann Timmer said. The court formed an AI committee to investigate the risks of parties fabricating AI-generated evidence but has also begun using AI-generated avatars to explain court rulings on YouTube. Timmer declined to comment on Wales's video but said any problems that arise from using AI-generated evidence during a sentencing would be decided under the state's existing guidelines for victim impact statements. "You can make statements that even can be emotional, but you can't go so far as to deprive someone of a fundamentally fair trial," Timmer said. Wales said she didn't think it was unfair to give a voice to her brother in court. The video would help keep his memory alive and gave her family closure after a long criminal trial, she said. "Of course, AI is uncanny," Wales said. "...
But in this moment, for Chris to be able to speak on his behalf, it was absolutely worth it."
[10]
AI clone of murder victim confronts killer in US courtroom
An AI clone of Chris Pelkey delivering his victim impact statement. Chris Pelkey, 37, was killed in a senseless road rage incident in Chandler, Arizona, in 2021. Earlier this month, during his killer's sentencing in court, Pelkey was virtually "present" through the use of artificial intelligence. The family used AI, voice recordings, videos, and photographs to reconstruct his image and voice. BBC reported that the family's goal in using AI to recreate Chris was to enable him to give his victim impact statement.
[11]
'I Loved That AI:' Judge Moved by AI-Generated Avatar of Man Killed in Road Rage Incident
How the sister of Christopher Pelkey made an avatar of him to testify in court. An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," the AI avatar of Christopher Pelkey said. "In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do." It was the first time the AI avatar of a victim -- in this case, a dead man -- has ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings. The avatar was made by Pelkey's sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey's brother-in-law, recoiled when she told him about the idea. "He told me, 'Stacey, you're asking a lot.'"
[12]
A Judge Accepted AI Video Testimony From a Dead Man
How the sister of Christopher Pelkey made an avatar of him to testify in court. An AI avatar made to look and sound like a man who was killed in a road rage incident addressed the court and the man who killed him: "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," the AI avatar of Christopher Pelkey said. "In another life we probably could have been friends. I believe in forgiveness and a God who forgives. I still do." It was the first time the AI avatar of a victim -- in this case, a dead man -- has ever addressed a court, and it raises many questions about the use of this type of technology in future court proceedings. The avatar was made by Pelkey's sister, Stacey Wales. Wales tells 404 Media that her husband, Pelkey's brother-in-law, recoiled when she told him about the idea. "He told me, 'Stacey, you're asking a lot.'" Gabriel Horcasitas killed Christopher Pelkey in 2021 during a road rage incident. Horcasitas was found guilty in March and faced a sentencing hearing earlier this month. As part of the sentencing, Pelkey's friends and family filed statements about how his death affected them. In a first, the Arizona court accepted an AI-generated video statement in which an avatar made to look and sound like Pelkey spoke.
[13]
After an Arizona man was shot, an AI video of him addresses his killer in court
For two years, Stacey Wales kept a running list of everything she would say at the sentencing hearing for the man who killed her brother in a road rage incident in Chandler, Ariz. But when she finally sat down to write her statement, Wales was stuck. She struggled to find the right words, but one voice was clear: her brother's. "I couldn't help hear his voice in my head of what he would say," Wales told NPR. That's when the idea came to her: to use artificial intelligence to generate a video of how her late brother, Christopher Pelkey, would address the courtroom and specifically the man who fatally shot him at a red light in 2021. On Thursday, Wales stood before the court and played the video -- in what AI experts say is likely the first time the technology has been used in the U.S. to create an impact statement read by an AI rendering of the deceased victim. A sister looking for the right words Wales has been thinking about her victim impact statement since the initial trial in 2023. The case was retried in 2025 because of procedural problems with the first trial. The chance to speak in court meant a great deal to Wales, who held back her emotions throughout both trials to avoid influencing the jury. "You're told that you cannot react, you cannot emote, you cannot cry," she said. "We looked forward to [sentencing] because we finally were gonna be able to react." Wales' attorney told her to humanize Pelkey and offer a complete picture of who he was. So Wales went on a mission. She said she contacted as many people from Pelkey's life as she could -- from his elementary school teacher to his high school prom date to the soldiers he served alongside in Iraq and Afghanistan. In total, Wales gathered 48 victim impact statements -- not counting her own. When it was time to write hers, she was torn between saying how she truly felt and what she thought the judge would want to hear. "I didn't wanna get up there and say, 'I forgive you,' 'cause I don't, I'm not there yet," she said.
"And the dichotomy was that I could hear Chris' voice in my head and he's like, 'I forgive him.'" Pelkey's mantra had always been to love God and love others, according to Wales. He was the kind of man who would give the shirt off his back, she said. While she struggled to find the right words for herself, Wales said writing from his perspective came naturally. "I knew what he stood for and it was just very clear to me what he would say," she added. A digitally trimmed beard and an inserted laugh That night, Wales turned to her husband Tim, who has experience using AI for work. "He doesn't get a say. He doesn't get a chance to speak," Wales said, referring to her brother. "We can't let that happen. We have to give him a voice." Tim and their business partner Scott Yentzer had only a few days to produce the video. The challenge: there's no single program built for a project like this. They also needed a long, clear audio clip of Pelkey's voice and a photo of him looking straight to the camera -- neither of which Wales had. Still, using several AI tools, Wales' husband and Yentzer managed to create a convincing video using about a 4.5-minute-video of Pelkey, his funeral photo and a script that Wales prepared. They digitally removed the sunglasses on top of Pelkey's hat and trimmed his beard -- which had been causing technological issues. Wales, who was heavily involved in making sure the video felt true to life, said recreating her brother's laugh was especially tough because most clips of Pelkey were filled with background noise. The experience made Wales reflect on her own mortality. So one evening, Wales stepped into her closest and recorded a nine-minute-video of herself talking and laughing -- just in case her family ever needs clear audio of her voice someday. "It was a weird out-of-body experience to think that way about your own mortality, but you never know when you're going to not be here," she said. 
The night before the sentencing hearing, Wales called her victim rights attorney, Jessica Gattuso, to tell her about the video. Gattuso told NPR that she was initially hesitant about the idea because she had never heard of it being done before in an Arizona court. She was also worried that the video might not be received well. But after seeing the video, she felt strongly that it should be shown in court. "I knew it would have an impact on everyone including the shooter, because it was a message of forgiveness," Gattuso said. The AI-generated video helped with healing, sister says Ten people spoke in support of Pelkey at the sentencing hearing. The AI-generated video of him went last. "Hello. Just to be clear for everyone seeing this, I'm a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," the AI avatar said. The video went on to thank everyone in Pelkey's life who contributed an impact statement and attended the hearing. Then, the video addressed his shooter, Gabriel Paul Horcasitas. "It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have and I still do," the video said. The video ended with the avatar encouraging everyone to love one another and live life to the fullest. "Well, I'm gonna go fishing now. Love you all. See you on the other side," it concluded. Neither the defense nor the judge pushed back. Later in the hearing, Judge Todd Lang said, "I loved that AI. Thank you for that." He added, "It says something about the family because you told me how angry you were and you demanded the maximum sentence. And even though that's what you wanted, you allowed Chris to speak from his heart, as you saw it. I didn't hear him asking for the maximum sentence." Horcasitas received 10.5 years for manslaughter. Wales said she didn't realize how deeply the video would affect her and her family.
For her teenage son, it was a chance to hear his uncle say goodbye. For Wales, it gave her the strength to finally look back at photos of her brother. "Going through this process of AI and what he'd sound like and trimming his beard and inserting laughs and all these other things, it was very cathartic and it was part of the healing process," she said. What AI and legal experts say Over the years, there have been a growing number of examples testing the bounds of AI's role in the courtroom. For instance, in 2023, President Trump's former lawyer Michael Cohen unwittingly sent his attorney bogus AI-generated legal citations. More recently, a man attempted to use an AI-generated lawyer avatar in court -- an effort that was quickly shut down by the judge. But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey's case. "Because this is in front of a judge, not a jury, and because the video wasn't submitted as evidence per se, its impact is more limited," she told NPR via email. Some experts, including Grossman, predict generative AI will become more common in the legal system, but it raises various legal and ethical questions. When it comes to victim impact statements, key concerns include questions around consent, fairness and whether the content was made in good faith. "Victim statements like this that truly try to represent the dead victim's voice are probably the least objectionable use of AI to create false videos or statements," Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State University's Sandra Day O'Connor College of Law, wrote in an email. "Many attempts to use AI to create deep fakes will be much more malevolent," he added.
Wales herself cautions people who may follow in her footsteps to act with integrity and not be driven by selfish motives. "I could have been very selfish with it," she said. "But it was important not to give any one person or group closure that could leave somebody else out."
[14]
Slain Man's AI-Generated Avatar Delivered a First-of-its-Kind Victim Statement in Court
An AI-generated video avatar of a man shot and killed in a 2021 road rage incident was presented in court as an unprecedented type of victim statement. The presiding judge claims the video moved him. As 404 Media reports, Christopher Pelkey was shot and killed in 2021 by Gabriel Paul Horcasitas following a road rage incident. As part of a family victim statement, Christopher Pelkey's AI-generated likeness and voice spoke directly to the shooter, suggesting that had the two met under different circumstances, they could probably have been friends. The AI-generated Pelkey offered further advice to people in general, advising people to make the most of their time and tell people they love them. Pelkey's sister, Stacey Wales, made the AI-generated version of Christopher with the help of her husband, Tim Wales, and their friend, Scott Yentzer. Stacey Wales wrote the script, provided the images for the AI-generated likeness, and used audio from a prerecorded interview that Christopher provided months before his death. Maricopa County Superior Court Judge Todd Lang reacted positively to the video. "I loved that AI, and thank you for that," Lang said. "As angry as you are, and as justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas could appreciate it, but so did I." "I love the beauty in what Christopher, and I call him Christopher -- I always call people by their last names, it's a formality of the court -- but I feel like calling him Christopher as we've gotten to know him today. I feel that that was genuine, because obviously the forgiveness of Mr. Horcasitas reflects the character I heard about today. But it also says something about the family, because you told me how angry you were, and you demanded the maximum sentence. And even though that's what you wanted, you allowed Chris to speak from his heart as you saw it. 
I didn't hear him asking for the maximum sentence," Lang continued, before sentencing Horcasitas to the maximum possible penalty for his manslaughter conviction -- 10.5 years. The prosecution had asked for nine. As 404 Media and The Associated Press report, Arizona law provides significant flexibility for what form victim impact statements can take. Pelkey's sister told 404 Media she struggled to write her own victim impact statement for the case before she decided to use AI to give her brother a voice in court. She and her husband, Tim, discussed the idea and ultimately decided it would work. "We talked about it and [Tim] says, 'You know you have to be careful with this stuff. In the wrong hands it can send the wrong message,'" Stacey told 404 Media. "He says, 'Because without the right script, this will fall short. It will be flat and hokey and I'm not going to let it go out if it's not authentic.'" "Our goal was to make the judge cry. Our goal was to bring Chris to life and humanize him," Stacey Wales said. Horcasitas' lawyer, Jason Lamm, told AP that they had filed a notice to appeal Horcasitas' sentence shortly after the sentencing hearing. The lawyer reportedly said the appellate court will likely consider whether the judge improperly relied on the AI video presented in court when determining the sentence. "There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," Gary Marchant, a member of Arizona's new committee on AI in courts and an Arizona State University law professor, told AP. "It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see."
[15]
A murder victim addressed his killer in court thanks to AI resurrection
Digital resurrection projects -- using artificial intelligence to bring back the likeness of people who have died -- have been a trend for at least two years. And, as AI gets more advanced, so do the resurrections. Most recently, Stacey Wales used AI to generate a video of her late brother, Christopher Pelkey, to address the courtroom at the sentencing hearing for the man who killed him in a road rage incident in Chandler, Arizona. According to NPR, it's the first time AI has ever been used in this way. "He doesn't get a say. He doesn't get a chance to speak," Wales told NPR, referring to her brother. "We can't let that happen. We have to give him a voice." Pelkey was a veteran who served as a sergeant in the U.S. Army, according to an online obituary. He was also heavily involved in his local church, and he went on multiple mission trips. His sister told NPR that he loved God, loved others, and would give a stranger the shirt off his back. He was 37 when he died. Wales created the AI video of her brother in a few days, but she didn't come up with the idea immediately. After two years of trying to craft a victim impact statement, Wales said she had the epiphany that the only voice that mattered was her late brother's. "Every time I'd get in the shower or the car and my thoughts were quiet, I wrote down what I was feeling -- frustrated, crying or emotions, yelling, anger, love, anything that I could think of," she told NBC News. "I've been writing it for two years, but I never had the idea to help Chris speak until a week and a half before this second trial." Wales also posted the AI video of her brother online, and you can watch the same video shown in the courtroom. "Hello. Just to be clear for everyone seeing this, I'm a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," the AI avatar said in the video.
AI Pelkey thanked everyone in his life, and said he and his shooter, Gabriel Paul Horcasitas, "could have been friends" in "another life." "Well, I'm gonna go fishing now. Love you all. See you on the other side," AI Pelkey said at the end of the video. According to NPR, Maricopa County Superior Court Judge Todd Lang said, "I loved that AI. Thank you for that." He gave Horcasitas the maximum sentence of just over a decade in prison for manslaughter. This isn't the first time people have pushed the limits of AI to create versions of people who have died. It's a phenomenon particularly beloved by TikTok true crime fans, as Rolling Stone reported in 2023. And just last year, youth-focused gun reform organizations March For Our Lives and Change the Ref used audio "deepfakes" to "resurrect" gun violence victims in a campaign to Congress.
[16]
Dead Arizona road rage victim addresses killer in court through AI
Three-and-a-half years later, Pelkey appeared in an Arizona court to address his killer. Sort of. "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," says a video recording of Pelkey. "In another life, we probably could have been friends." "I believe in forgiveness, and a God who forgives. I always have, and I still do," Pelkey continues, wearing a grey baseball cap and sporting the same thick red and brown beard he wore in life. Pelkey was 37 years old, devoutly religious and an army combat veteran. Horcasitas shot Pelkey at a red light in 2021 after Pelkey exited his vehicle and walked back towards Horcasitas's car. Pelkey's appearance from beyond the grave was made possible by artificial intelligence in what could be the first use of AI to deliver a victim impact statement. Stacey Wales, Pelkey's sister, told local outlet ABC15 that she had a recurring thought when gathering more than 40 impact statements from Chris's family and friends. "All I kept coming back to was, what would Chris say?" Wales said. As AI spreads across society and enters the courtroom, the US Judicial Conference's advisory committee has announced that it will begin seeking public comment as part of determining how to regulate the use of AI-generated evidence at trial. Wales and her husband fed an AI model videos and audio of Pelkey to try to come up with a rendering that would match the sentiments and thoughts of Pelkey as he was in life, something that Wales compared to a "Frankenstein of love" in an interview with local outlet Fox 10. Judge Todd Lang responded positively to the AI usage. Lang ultimately sentenced Horcasitas to 10-and-a-half years in prison on manslaughter charges. "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness," Lang said. "I feel that that was genuine."
Also in favor was Pelkey's brother John, who said that he felt "waves of healing" from seeing his brother's face, and believes that Chris would have forgiven his killer.
[17]
AI version of dead Arizona road rage victim addresses killer in court - video
Chris Pelkey was killed in a road rage shooting in Chandler, Arizona, in 2021. Three-and-a-half years later, Pelkey appeared in an Arizona court to address his killer. Sort of. 'To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,' says a video recording of Pelkey. 'In another life, we probably could have been friends.' Pelkey continues: 'I believe in forgiveness, and a God who forgives. I always have, and I still do.' Pelkey was 37 years old, devoutly religious and an army combat veteran. Horcasitas shot Pelkey at a red light in 2021 after Pelkey exited his vehicle and walked back towards Horcasitas's car. Pelkey's appearance from beyond the grave was made possible by artificial intelligence in what could be the first use of AI to deliver a victim impact statement.
[18]
AI enables slain man to address courtroom at killer's sentencing
In what's believed to be a world first, artificial intelligence (AI) has allowed a slain man to address his killer at the sentencing hearing. Christopher Pelkey was shot dead in a road rage incident in Chandler, Arizona, four years ago, but just recently, AI was used to recreate a digital version of the victim that was allowed to make a statement during court proceedings, a local news site reported. The video presentation also included real clips of Pelkey to give those in court a clearer understanding of his personality. Some of these clips were also used to create the AI-generated likeness of Pelkey, which you can see below. In the video played in court, the AI version of Pelkey says: "To Gabriel Horcasitas, the man who shot me -- it is a shame we encountered each other that day in those circumstances." He continues: "I'm a version of Chris Pelkey recreated through AI that uses my picture and my voice profile. In another life, we probably could've been friends. I believe in forgiveness and in God who forgives. I always have and I still do." After watching the video, Judge Todd Lang said: "I love that AI. Thank you for that. I felt like that was genuine, that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about today." The judge then sentenced Horcasitas to ten-and-a-half years for Pelkey's manslaughter. It was Chris Pelkey's sister, Stacey, who came up with the idea to use AI to create a likeness of her brother for use in court. She said it was important "not to make Chris say what I was feeling, and to detach and let him speak because he said things that would never come out of my mouth, but that I know would come out of his." Ann A. Scott Timmer, Chief Justice of the Arizona Supreme Court, commented that AI has the potential "to create great efficiencies in the justice system and may assist those unschooled in the law to better present their positions. For that reason, we are excited about AI's potential."
Timmer added: "But AI can also hinder or even upend justice if inappropriately used. A measured approach is best. Along those lines, the court has formed an AI committee to examine AI use and make recommendations for how best to use it ... Those who use AI -- including courts -- are responsible for its accuracy." Indeed, while the use of AI in this way brings a powerful and deeply personal element to court proceedings, it also raises various ethical and legal concerns about authenticity, emotional influence, and appropriate application. As a result, it seems likely that other courts will at some point develop guidelines for future cases, if they choose to allow AI-generated victim statements.
[19]
Family Uses AI To Revive Dead Brother For Impact Statement in Killer's Trial
In Arizona, the family of a man killed during a road rage incident has used artificial intelligence to revive their dead loved one in court -- and the video is just as unsettling as you think. As Phoenix's ABC 15 reports, an uncanny simulacrum of the late Christopher Pelkey, who died from a gunshot wound in 2021, played in a courtroom at the end of his now-convicted killer's trial. "In another life, we probably could have been friends," the AI version of Pelkey, who was 37 when he died, told his shooter, Gabriel Paul Horcasitas. "I believe in forgiveness." Despite that moving missive, it doesn't seem that much forgiveness was in the cards for Horcasitas. After viewing the video -- which was created by the deceased man's sister, Stacey Wales, using an "aged-up" photo Pelkey made when he was still alive -- the judge presiding over the case ended up giving the man a 10-and-a-half year manslaughter sentence, which is a year more than what state prosecutors were asking for. In the caption on her video, Wales explained that she, her husband Tim, and their friend Scott Yentzer made the "digital AI likeness" of her brother using a script she'd written alongside images and audio files they had of him speaking in a "prerecorded interview" taken months before he died. "These digital assets and script were fed into multiple AI tools to help create a digital version of Chris," Wales wrote, "polished by hours of painstaking editing and manual refinement." In her interview with ABC15, Pelkey's sister insisted that everyone who knew her late brother "agreed this capture was a true representation of the spirit and soul of how Chris would have thought about his own sentencing as a murder victim." She added that creating the digital clone helped her and her family heal from his loss and left her with a sense of peace, though others felt differently. "Can't put into words how disturbing I find this," writer Eoin Higgins tweeted of the Pelkey clone.
"The idea of hearing from my brother through this tech is grotesque. Using it in a courtroom even worse." Referencing both the Pelkey video and news that NBC is planning to use late sports narrator Jim Fagan's voice to do new promos this coming NBA season, a Bluesky user insisted that "no one better do this to me once I'm dead." "This AI necromancy bullshit is so creepy and wrong," that user put it -- and we must say, it's hard to argue with that.
[20]
The Judge's Reaction to an AI-Generated Victim Impact Statement Was Not What We Expected
A slain Arizona man's family used AI to bring him back from the dead for his killer's sentencing hearing -- and the judge presiding over the case apparently "loved" it. As 404 Media reports, Judge Todd Lang was flabbergasted when he saw the AI-generated video of victim Chris Pelkey that named and "forgave" the man who killed him in 2021. "To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances," the video, which Pelkey's sister Stacey Wales generated, intoned. "In another life we probably could have been friends. I believe in forgiveness, in God who forgives, I always have. And I still do." Horcasitas was found guilty earlier this year, and his sentencing was contingent, as many cases are, upon various factors, including impact statements from the victim's family. As Wales told 404 Media, her husband Tim was initially freaked out when she introduced the idea of creating a digital clone of her brother for the hearing and told her she was "asking a lot." Ultimately, the video was accepted in the sentencing hearing, the first known instance of an AI clone of a deceased person being used in such a way. And the gambit appears to have paid off. "I loved that AI, and thank you for that," Lang said, per a video of his pre-sentencing speech. "As angry as you are, and as justifiably angry as the family is, I heard the forgiveness, and I know Mr. Horcasitas could appreciate it, but so did I." "I feel like calling him Christopher as we've gotten to know him today," Lang continued. "I feel that that was genuine, because obviously the forgiveness of Mr. Horcasitas reflects the character I heard about today." Lang acknowledged that although the family itself "demanded the maximum sentence," the AI Pelkey "spoke from his heart" and didn't call for such punishment. "I didn't hear him asking for the maximum sentence," the judge said.
Horcasitas' lawyer also referenced the Pelkey avatar when defending his client, saying he too believes his client and the man he killed could have been friends had circumstances been different. That entreaty didn't seem to mean much, however, to Lang. He ended up sentencing Horcasitas to 10.5 years for manslaughter, a year more than prosecutors were seeking. It's a surprising reaction, showing that many are not only open to AI being used this way, but also in favor of it -- evidence that the chasm between AI skeptics and adopters could be widening.
[21]
Back From the Dead: Man Killed in Road Rage Incident 'Testifies' in Court - Decrypt
Christopher Pelkey spoke directly to the man who shot and killed him during a 2021 road rage incident in Arizona -- three-and-a-half years after his death. "To Your Honor, Judge Lang, thank you for making yourself available to see this case to the end. Especially when the rescheduled trial conflicted with your daughter's spring break," the AI version of Christopher Pelkey said in a video presented in the Phoenix, Arizona courtroom. "To Gabriel Horcasitas, the man who shot me: It is a shame we encountered each other that day in those circumstances," Pelkey's voice and likeness rang out in the courtroom. "In another life, we probably could've been friends. I believe in forgiveness and in God who forgives. I always have and I still do." The voice wasn't actually Pelkey's -- it came from an AI-generated video created by his family for the sentencing hearing. The digital resurrection marked what's believed to be the first time artificial intelligence has been used to deliver a victim impact statement in court. Maricopa County Superior Court Judge Todd Lang was visibly moved by the presentation. "I love that AI. Thank you for that," Lang told the family after watching the video. "I loved the beauty in what Christopher (said)... I felt like that was genuine." Horcasitas was found guilty of manslaughter for shooting 37-year-old Pelkey during a road rage confrontation in Chandler, Arizona in 2021. The judge sentenced 54-year-old Horcasitas to 10.5 years in prison -- one year more than what prosecutors had requested. The idea of using the victim's likeness in court came from Pelkey's sister, Stacey Wales, as she collected impact statements from family and friends. She received 49 letters for the judge to read, but that wasn't enough. "There was one missing piece. There was one voice that was not in those letters," Wales told Fox 10 Phoenix. Creating the AI version of her brother wasn't simple.
Wales turned to her husband Tim and their friend Scott Yentzer, who have been working with AI technology for years. The team cobbled together various tools to bring Pelkey back to life -- what Wales called "a Frankenstein of love." More challenging than dealing with the technical aspects was deciding what Pelkey would actually say. Wales had strong opinions about the sentencing and whether to forgive Horcasitas, but claims she did her best to be fair. "It was important not to make Chris say what I was feeling, and to detach and let him speak -- because he said things that would never come out of my mouth, but I know would come out of his," Wales explained. The AI simulation wasn't perfect, but was good enough to trigger emotional responses in those who watched it. The video even included a photo of Pelkey that had been run through an "old age" filter to show what he might have looked like had he lived. "This is the best I can ever give you of what I would have looked like if I got the chance to grow old," the AI version of Pelkey said. "Remember, getting old is a gift that not everybody has, so embrace it and stop worrying about those wrinkles." Of course, the concept of using a deepfake image of a dead person to manipulate judges and juries is a new kind of problem that only generative AI could have created. Arizona Chief Justice Ann Timmer noted in a statement to ABC Arizona that while AI offers potential benefits, it could also "hinder or even upend justice if inappropriately used." And that's a double-edged sword that could be exploited in the future, especially as AI video generators improve in quality to produce almost lifelike outputs -- and people use them more and more to animate photos of their deceased loved ones. Arizona State professor of law Gary Marchant also questioned the use of AI in courts.
"If you look at the facts of this case, I would say that the value of it [outweighed] the prejudicial effect," he told the local news outlet AZ Family, "but if you look at other cases, you could imagine where they would be very prejudicial."
[22]
Man murdered in 2021 "speaks" at killer's sentencing hearing thanks to AI video
A man who was killed in a road rage incident in Arizona "spoke" during his killer's sentencing hearing after his family used artificial intelligence to create a video of him reading a victim impact statement. In what's believed to be a first in U.S. courts, the family of Chris Pelkey used AI to give him a voice. Pelkey was shot by Gabriel Paul Horcasitas on Nov. 13, 2021, as both drivers were stopped at a red light. According to records, Pelkey was shot after getting out of his truck and walking toward Horcasitas' car. The AI rendering of Pelkey told the shooter during the sentencing hearing last week that it was a shame they had to meet under those circumstances -- and that the two of them probably could have been friends in another life. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to share advice for people to make the most of each day and to love each other, not knowing how much time one might have left. While use of artificial intelligence within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. But using AI to generate victim impact statements marks a new -- and legal, at least in Arizona -- tool for sharing information with the court outside the evidentiary phases. Maricopa County Superior Court Judge Todd Lang, who presided over the case, said that he "loved that AI," according to CBS News partner BBC News. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. Family and friends also submitted nearly 50 letters to the judge, echoing the video's message. Horcasitas, 54, was convicted of manslaughter and sentenced to 10.5 years in prison. 
Horcasitas' lawyer, Jason Lamm, told The Associated Press they filed a notice to appeal his sentence within hours of the hearing. Lamm said it's likely that the appellate court will weigh whether the judge improperly relied on the AI video when handing down the sentence. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself. For years, while the case worked its way through the legal system, Wales said she thought about what she would say at the sentencing hearing. She struggled to get words down on paper. But when she thought about what her brother would say to the shooter, knowing he would have forgiven him, the words poured out of her. "We approached this with ethics and morals because this is a powerful tool," she told the BBC. "Just like a hammer can be used to break a window or rip down a wall, it can also be used as a tool to build a house and that's how we used this technology." Pelkey was born in Poughkeepsie, New York, and later lived in Arizona, according to his obituary. He was a veteran who served three tours in Iraq and Afghanistan and was actively involved in his local church and went on local and international mission trips, the obituary said. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Arizona Supreme Court Chief Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility to AI in recent years led to the creation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms. 
"There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," he said. "It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see." Marchant pointed to a recent case in New York, where a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen didn't exist at all.
[23]
Road rage victim 'speaks' via A.I. at his killer's sentencing
Christopher Pelkey in an A.I.-generated victim impact statement recently used in court. Courtesy Pelkey Family The road-rage killer of an Arizona man was sentenced to 10.5 years behind bars after his victim spoke to the court, via artificial intelligence, in what could be the first-of-its-kind use of this technology, officials said Wednesday. Maricopa County Superior Court Judge Todd Lang on Thursday gave the maximum sentence to Gabriel Paul Horcasitas for the fatal shooting of Christopher Pelkey, 37, on Nov. 13, 2021, prosecutors said. Horcasitas, now 54, was convicted of manslaughter and endangerment earlier this year. Lang allowed Pelkey's loved ones to play an A.I.-generated version of the victim -- his face, body and a lifelike voice that appeared to ask the judge for leniency. "To Gabriel Horcasitas, the man who shot me: It is a shame we encountered each other that day in those circumstances," the artificial version of Pelkey said. "In another life, we probably could have been friends. I believe in forgiveness." The idea of using an A.I. version of Pelkey came from his family, not the state, according to the man's loved ones and a Maricopa County Attorney's Office spokesperson. The victim's sister Stacey Wales and brother-in-law both work in the A.I. field. When Wales suggested bringing her late brother to life like this, she said her husband was more than hesitant. "He recoiled," Wales told NBC News. "And he said, 'Stacey, do you know what you're asking me to do? This is my best friend.' And I said, 'I know. It's my brother.' And then he said, 'If this isn't perfect, if this doesn't go out and really embody the spirit of Chris, I'm not going to let this be shown.'" Horcasitas had been convicted of manslaughter and endangerment at trial in spring 2023. But a new trial was ordered when a judge ruled that prosecutors failed to properly disclose potentially key evidence in a timely manner. Wales said she hadn't come up with this idea in 2023.
After two years of trying to craft a victim-impact statement, Wales said, she had the epiphany that the only voice that mattered was her late brother's. "Every time I'd get in the shower or the car and my thoughts were quiet, I wrote down what I was feeling -- frustrated, crying, or emotions, yelling, anger, love, anything that I could think of," she said. "I've been writing it for two years, but I never had the idea to help Chris speak until a week and a half before this second trial." She added: "What I had to say did not seem like it would do justice to the last person listening to make a decision on Chris' life." Horcasitas faced between 7 and 10.5 years in prison. The defense asked for the lowest punishment. The judge delivered the maximum, but acknowledged the words in the presentation. "And as angry as you are, justifiably angry as the family is, I heard the forgiveness," Judge Lang said. "I feel like that was genuine, that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about (the victim) today." Defense lawyer Jason Lamm said the A.I. presentation created a strong issue for appeal. "While judges certainly have latitude as to what to hear, particularly from victims, an appellate court will have to decide if this was error," Lamm said. "If it was just simply too far over the line in terms of being inflammatory and to what degree the judge relied on it in imposing a sentence on my client." Arizona State University law professor Gary Marchant, who specializes in ethics and emerging technologies, praised the victim's loved ones for producing a work that appeared to go against their self-interest of securing a maximum penalty for Horcasitas. But the professor said he's worried about the precedent this sets. "The family did a really good job of representing what he would have said, and they would have the best sense of what he would have said," Marchant said. "But on the other hand, it's completely fake, right? It's not true."
While prosecutors and defense lawyers have long used visual aids, charts and other illustrations to make their points, Marchant said A.I. presents new ethical challenges. "I mean, it's a blurry line, right," said Marchant, who sits on a state Supreme Court committee advising on the use of A.I. "You see someone speaking who isn't really speaking, right? You see that person in the courtroom actually speaking and in reality, they're dead and they're not speaking. So this is an extra jump that I feel is going to get us into dangerous grounds."
[24]
AI brought a road rage victim 'back to life' in court. Experts say it went too far
To do so, they turned to technology: An AI-generated video featuring a re-created voice and likeness of Pelkey was presented as a victim impact statement ahead of sentencing. The video showed a digitally resurrected Pelkey appearing to speak directly to the judge. Of course, the statement wasn't truly Pelkey's. He couldn't have possibly said those words -- he died the day Horcasitas shot him. Yet the judge accepted the AI-generated message, even acknowledging its effect. "You allowed Chris to speak from his heart as you saw it," the judge said. Horcasitas was sentenced to 10 and a half years in prison. The extraordinary courtroom moment has sparked widespread discussion, not just for its emotional power but for the precedent it may set. Arizona's victims' rights laws allow the families of deceased victims to determine how their impact statements are delivered. But legal and AI experts warn that this precedent is far from harmless.
[25]
Family uses AI video in court to allow victim to give impact statement - SiliconANGLE
Family uses AI video in court to allow victim to give impact statement In what is a first in Arizona judicial history, possibly nationwide, artificial intelligence has brought a lifelike rendering of a deceased victim to court for an impact statement during a sentencing hearing. Christopher Pelkey, 37, was killed in a road rage incident in Chandler, AZ, in 2021. Last week, his killer was sentenced to 10.5 years in prison for manslaughter. During sentencing, a nearly four-minute video was played in court, showing a digital recreation of Pelkey delivering a statement to his killer. Victim impact statements are typically used during sentencing to inform the court about the emotional, psychological and financial consequences of a crime on the people it affected. Such statements are usually delivered by the victim's family to affect the sentencing of the convicted by sharing personal experiences and perspectives. To create the video, Pelkey's family used AI technology to recreate his likeness and give him a voice, according to an article in ABC News. It was generated using a single photograph and audio from a YouTube video where he discussed PTSD. It displayed his head and torso, showing him facing the camera, wearing a hoodie and a ball cap. "Just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile," Pelkey's likeness said. The script for the video, written by Pelkey's sister, sought to embody his penchant for forgiveness and compassion. "It is a shame we encountered each other that day in those circumstances," the avatar of Mr. Pelkey said. "In another life, we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have and I still do." As AI has begun to wind itself into society at large, it has also woven itself into the legal system. There has been a growing number of examples of its use in the courtroom.
More lawyers have gotten themselves into trouble by using AI to prepare legal briefs, leading them to cite hallucinated case law. This includes Morgan & Morgan, one of the top law firms in the United States, which warned its attorneys to be more careful after one of its lawyers blindly cited AI-faked case law earlier this year. In another instance, a man in a New York appeals court acting as his own lawyer attempted to submit an AI-generated video to present his argument to the assembled panel of judges, without letting them know it was an AI avatar. According to The Associated Press, the justices roundly rejected the AI avatar speaking for him, saying they did not appreciate being misled. Officials at the Arizona Supreme Court started using two AI-generated spokespeople in March to deliver court news in brief videos, such as case decisions and opinions. Although not directly related to the legal profession, it shows a growing comfort with AI entering the lives of court workers. As for the use of AI in sentencing, it is completely new ground. Cynthia Godsoe, a professor at Brooklyn Law School and a former public defender who assists the American Bar Association with best practices, told the New York Times in an interview that she thought the use of AI was troubling. "It's clearly going to inflame emotions more than pictures," Godsoe said. "I think courts have to be really careful. Things can be altered. We know that. It's such a slippery slope." Other experts, such as Maura Grossman, a professor at the University of Waterloo who studies the applications of AI in criminal and civil cases, told NPR that there probably weren't any significant legal or ethical issues in the Pelkey case. "Because this is in front of a judge, not a jury, and because the video wasn't submitted as evidence per se, its impact is more limited," she said.
[26]
An Arizona family used AI to recreate a road rage victim's voice
The family of a man killed in a 2021 road rage incident in Arizona used artificial intelligence to portray the victim delivering his own impact statement during his killer's sentencing hearing, according to local news reports. Christopher Pelkey's sister, brother-in-law, and their friend used AI technology to recreate his likeness, reportedly drawing from video clips recorded while he was alive. It is believed to be one of the first -- if not the very first -- instances of an AI-generated victim impact statement being used in court. "To Gabriel Horcasitas, the man who shot me: it is a shame we encountered each other that day in those circumstances," the artificial 37-year-old said in the video. "In another life, we probably could've been friends. I believe in forgiveness and in God who forgives. I always have, and I still do." Judge Todd Lang appreciated the video, according to Fox 10 News. Prosecutors requested a 9.5-year sentence for Horcasitas; ABC 15 reported that he was ultimately sentenced to more than a decade for manslaughter.
[27]
Murder victim 'speaks' beyond the grave in AI-generated video at court hearing
The family of an Arizona man killed in a road rage incident nearly four years ago brought him back last week as an AI-generated image to face the man responsible for his killing and give an impact statement to the judge. The video message created by Christopher Pelkey's sister that used his likeness and voice during the May 1 sentencing was the first time the technology was used in an Arizona court at a sentencing, according to records. Pelkey was killed in November 2021 by Gabriel Paul Horcasitas, who was ultimately convicted of manslaughter charges. The AI-generated Pelkey spoke to Horcasitas in court and sought forgiveness. "In another life, we probably could have been friends," the avatar said in the video. "I believe in forgiveness and in God who forgives. I always have and I still do." Stacey Wales, Pelkey's sister, told ABC affiliate KNXV that the slain victim's friends and family "agreed this capture was a true representation of the spirit and soul of how Chris would have thought about his own sentencing as a murder victim." Wales said she wrote the script for the video and noted that her brother was a forgiving, God-fearing man. Dozens of other family members also provided victim impact statements and expressed anger over Horcasitas' actions. Prosecutors asked the judge for Horcasitas to be sentenced to nine and a half years in prison, but Judge Todd Lang ultimately issued a 10 and a half year sentence. Lang said he was moved by the AI-generated video. "I loved that AI, thank you for that. As angry as you are, as justifiably angry as the family is, I heard the forgiveness," the judge said during the sentencing. "I feel that that was genuine." Horcasitas's attorney, Jason Lamm, told ABC News that he was not given advance notice about the video. He argued in court that Pelkey was the one who instigated the road rage incident and what the judge heard was a "kinder, more gentle" version of Pelkey.
"I appreciate the fact that victims have the right to address the court, and this was a cathartic endeavor for Stacey Wales, but this was cringe," Lamm told ABC News. He said he has filed a notice of appeal for his client and that the use of the AI-generated video will likely be one of the points of contention. "This will be a bellwether case not just for Arizona but also courts around the country to rule on the use of AI in victim impact statements," Lamm said. Arizona Supreme Court Chief Justice Ann Timmer provided a statement to KNXV about the use of AI in court cases. "AI has the potential to create great efficiencies in the justice system and may assist those unschooled in the law to better present their positions. For that reason, we are excited about AI's potential. But AI can also hinder or even upend justice if inappropriately used," she said in her statement. "A measured approach is best. Along those lines, the court has formed an AI committee to examine AI use and make recommendations for how best to use it. At bottom, those who use AI -- including courts -- are responsible for its accuracy," she added.
[28]
AI-generated video gave victim a voice at his killer's sentencing in Arizona
CHANDLER, Ariz. -- There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of fatally shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself -- even if it was a version generated by artificial intelligence. In what's believed to be a first in U.S. courts, Pelkey's family used AI to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week in Phoenix that it was a shame they had to meet that day in 2021 under those circumstances -- and that in another life, the two of them probably could have been friends. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to encourage people to make the most of each day and to love each other, not knowing how much time one might have left. While use of AI within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. But using AI to generate victim impact statements marks a new -- and legal, at least in Arizona -- tool for sharing information with the court outside the evidentiary phases. Maricopa County Superior Court Judge Todd Lang, who presided over the case, said after watching the video that he imagined Pelkey, who was 37 at the time of his killing, would have felt that way after learning about him. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. "Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang said. 
Horcasitas, 54, was convicted of manslaughter and sentenced to 10.5 years in prison. Horcasitas' lawyer, Jason Lamm, told The Associated Press they filed a notice to appeal his sentence within hours of the hearing. Lamm said it's likely the appeals court will weigh whether the judge improperly relied on the AI video when handing down the sentence. The shooting happened the afternoon of Nov. 13, 2021, as both drivers were stopped at a red light. According to records, Pelkey was shot after getting out of his truck and walking toward Horcasitas' car. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself after struggling to figure out what she would say. She wrote a script for the AI-generated video, reflecting that he was a forgiving person. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Wales, a software product consultant, took the AI idea to her husband, Tim. He and his friend have work experience creating humanlike AI avatars. Using a video clip of Pelkey, they aimed to replicate his voice and speech patterns. They generated Pelkey's likeness through a single image of him, digitally manipulating it to remove glasses and a hat logo, edit his outfit and trim his beard. Arizona Supreme Court Chief Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility of AI in recent years led to the creation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms. "There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," he said.
"It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see." Marchant pointed to a recent case in New York, where a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen didn't exist at all. In the Arizona case, Wales said the AI-generated video worked because the judge had nearly 50 letters from family and friends that echoed the video's message. "Everybody knew that Chris would forgive this person," Wales said. ___ Yamat reported from Las Vegas. Associated Press reporter Susan Montoya Bryan in Albuquerque, New Mexico, contributed to this report.
[29]
Man speaks from beyond the grave to his killer in a courtroom through AI
A deceased man has been digitally brought back to life to address his killer in a courtroom, with the goal being to showcase the victim's character. Artificial intelligence has been used to bring a man killed in a road rage shooting back to life to address his killer at the sentencing hearing. Yes, this is real. Four years ago, Christopher Pelkey was shot dead in a road rage incident in Chandler, Arizona, but through the power of AI, Pelkey has been digitally recreated to make a statement in court during the trial, according to a report from local news site ABC 15. The video played in court shows the AI-generated Pelkey addressing the man who shot him dead, Gabriel Horcasitas, with Pelkey's digital recreation saying the following: "To Gabriel Horcasitas, the man who shot me - it is a shame we encountered each other that day in those circumstances. In another life, we probably could've been friends. I believe in forgiveness and in God who forgives. I always have and I still do." The AI video was accompanied by real videos of Pelkey to show a clearer understanding of his personality to the court. Notably, some of the real videos of Pelkey were used to create the AI-generated video. Judge Todd Lang responded to the AI-generated video, saying, "I love that AI. Thank you for that. I felt like that was genuine, that his obvious forgiveness of Mr. Horcasitas reflects the character I heard about today." The use of AI-generated victim statements in court proceedings adds a previously impossible element to the process, and while the videos can be touching and impactful through their personal element, they also raise some pretty serious concerns about where the line is drawn on their use. The main concerns are authenticity, emotional influence, and appropriate application.
It's likely courts will adopt some AI guidelines for the use of these types of videos and how they can influence the general proceedings.
[30]
He was killed in a road rage shooting. AI allowed him to deliver his own victim impact statement
CHANDLER, Ariz. (AP) -- There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself -- in this case, an AI-generated version. In what's believed to be a first in U.S. courts, Pelkey's family used artificial intelligence to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week that it was a shame they had to meet that day in 2021 under those circumstances -- and that the two of them probably could have been friends in another life. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to share advice for people to make the most of each day and to love each other, not knowing how much time one might have left. While use of artificial intelligence within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. Using AI to generate victim impact statements marks a new -- and legal, at least in Arizona -- tool for sharing information with the court outside the evidentiary phases. Maricopa County Superior Court Judge Todd Lang, who presided over the road rage case, said after watching the video that he imagined Pelkey, who was 37 at the time of his killing, would have felt that way after learning about him. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. Horcasitas was convicted of manslaughter and sentenced to 10.5 years in prison.
"Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang said. The Associated Press left phone and emailed messages Wednesday seeking comment from Horcasitas' lawyer. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself. For years, while the case worked its way through the legal system, Wales said she thought about what she would say at the sentencing hearing. She struggled to get words down on paper. But when she thought about what her brother would say to the shooter, knowing he would have forgiven him, the words poured out of her. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Arizona Supreme Court Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility to AI in recent years led to the formation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms. "There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," he said. "It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see." Marchant pointed to a recent case in New York, where a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges on the appeals court to realize that the man addressing them from the video screen didn't exist at all. 
In the Arizona case, Wales said the AI-generated video worked because the judge had nearly 50 letters from family and friends that echoed the video's message. "There was a solid gold thread through all of those stories -- that was the heart of Chris," Wales said. "This works because it talks about the kind of person Chris was." ___ Yamat reported from Las Vegas. Associated Press reporter Susan Montoya Bryan in Albuquerque, New Mexico, contributed to this report.
[32]
From AI avatars to virtual reality crime scenes, courts are grappling with AI in the justice system
Stacey Wales gripped the lectern, choking back tears as she asked the judge to give the man who shot and killed her brother the maximum possible sentence for manslaughter. What appeared next stunned those in the Phoenix courtroom last week: An AI-generated video with a likeness of her brother, Christopher Pelkey, told the shooter he was forgiven. The judge said he loved and appreciated the video, then sentenced the shooter to 10.5 years in prison -- the maximum sentence and more than what prosecutors sought. Within hours of the hearing on May 1, the defendant's lawyer filed a notice of appeal. Defense attorney Jason Lamm won't be handling the appeal, but said a higher court will likely be asked to weigh in on whether the judge improperly relied on the AI-generated video when sentencing his client. Courts across the country have been grappling with how to best handle the increasing presence of artificial intelligence in the courtroom. Even before Pelkey's family used AI to give him a voice for the victim impact portion -- believed to be a first in U.S. courts -- the Arizona Supreme Court created a committee that researches best AI practices. In Florida, a judge recently donned a virtual reality headset meant to show the point of view of a defendant who said he was acting in self-defense when he waved a loaded gun at wedding guests. The judge rejected his claim. And in New York, a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen wasn't real. Experts say using AI in courtrooms raises legal and ethical concerns, especially if it's used effectively to sway a judge or jury. And they argue it could have a disproportionate impact on marginalized communities facing prosecution. 
"I imagine that will be a contested form of evidence, in part because it could be something that advantages parties that have more resources over parties that don't," said David Evan Harris, an expert on AI deep fakes at UC Berkeley's business school. AI can be very persuasive, Harris said, and scholars are studying the intersection of the technology and manipulation tactics. Cynthia Godsoe, a law professor at Brooklyn Law School and a former public defender, said as this technology continues to push the boundaries of traditional legal practices, courts will have to confront questions they have never before had to weigh: Does this AI photograph really match the witness's testimony? Does this video exaggerate the suspect's height, weight, or skin color? "It's definitely a disturbing trend," she said, "because it could veer even more into fake evidence that maybe people don't figure out is false." In the Arizona case, the victim's sister told The Associated Press that she did consider the "ethics and morals" of writing a script and using her brother's likeness to give him a voice during the sentencing hearing. "It was important to us to approach this with ethics and morals and to not use it to say things that Chris wouldn't say or believe," Stacey Wales said. Victims can give their impact statements in any digital format in Arizona, said victims' rights attorney Jessica Gattuso, who represented the family. When the video played in the courtroom, Wales said only she and her husband knew about it. "The goal was to humanize Chris and to reach the judge," Wales said. After viewing it, Maricopa County Superior Court Judge Todd Lang said he "loved the beauty in what Christopher" said in the AI video. "It also says something about the family," he said. "Because you told me how angry you were, and you demanded the maximum sentence, and even though that's what you wanted, you allowed Chris to speak from his heart as you saw it." 
The defendant's lawyer said that on appeal, the judge's comments could be a factor in the sentence being overturned. ___ Associated Press reporters Sarah Parvini in Los Angeles, Sejal Govindarao in Phoenix and Kate Payne in Tallahassee, Florida, contributed to this report.
[34]
Family Creates AI Video to Depict Arizona Man Addressing His Killer in Court
CHANDLER, ARIZONA (Reuters) - A simulation of a dead man created by artificial intelligence addressed his killer in an Arizona court this month, in what appears to be one of the first such instances in a U.S. courtroom. Made by his family, an AI-generated avatar of Christopher Pelkey spoke in Maricopa County Superior Court on May 1, as a judge prepared to sentence Gabriel Paul Horcasitas for shooting and killing Pelkey in a 2021 road-rage incident. "It is a shame we encountered each other that day in those circumstances," the Pelkey avatar says in the video. "In another life, we probably could have been friends." The Pelkey avatar appears in the video sporting a long beard and green sweatshirt against a white backdrop. He cautions at the start that he is an AI version of Pelkey, which is apparent through the gaps in audio and slightly mismatched movement of his mouth. Pelkey, a U.S. Army veteran, was 37 at the time of the shooting. The video marked a novel use of AI in the legal system, which has viewed the rapidly growing technology with a mix of fascination and trepidation. Courts generally have strict rules on the types of information that can be presented in legal proceedings, and several lawyers have been sanctioned after AI systems created fake cases that they cited in legal briefs. Pelkey's relatives were given more leeway to present the AI-generated video to the judge at sentencing, given that it was not evidence in the case. Horcasitas, who was sentenced to 10.5 years in state prison, had already been convicted on manslaughter and endangerment charges. Pelkey's sister Stacey Wales said she scripted the AI-generated message after struggling to convey years of grief and pain in her own statement. She said she was not ready to forgive Horcasitas, but felt her brother would have a more understanding outlook. "The goal was to humanize Chris, to reach the judge, and let him know his impact on this world and that he existed," she told Reuters.
Generative AI, Wales said, is "just another avenue that you can use to reach somebody." Wales said she worked with her husband and a family friend, who all work in the tech industry, to create it. Harry Surden, a law professor at the University of Colorado, said the use of generative AI material in court raises ethical concerns, as others may seek to use those tools to play on the emotions of judges and juries. The content is a simulation of reality, not the verified evidence that courts typically assess, Surden said. "What we're seeing is the simulations have gotten so good that it completely bypasses our natural skepticism and goes straight to our emotion," he said. (Reporting by Liliana Salgado in Chandler, Arizona and Andrew Goudsward in Washington; editing by Andy Sullivan and Aurora Ellis)
[35]
How an AI-Generated Video Gave a Victim a Voice at His Killer's Sentencing
There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of fatally shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself -- even if it was a version generated by artificial intelligence. In what's believed to be a first in U.S. courts, Pelkey's family used AI to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week in Phoenix that it was a shame they had to meet that day in 2021 under those circumstances -- and that in another life, the two of them probably could have been friends. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas.
[38]
Dead man's AI replica speaks to shooter in court: 'We probably could have been friends'
Christopher Pelkey was killed in a road rage incident in 2021. His family created an AI-generated version of him to speak at his killer's sentencing. "It is a shame we encountered each other that day," a voice from the grave echoed throughout a Phoenix courtroom. "In another life, we probably could have been friends." The voice sounded a lot like Chris Pelkey, a 37-year-old U.S. Army veteran fatally shot more than three years ago during a road-rage attack in Chandler, Arizona. But it wasn't actually Pelkey. What onlookers heard on May 1 in a Maricopa County courtroom was a replica made using artificial intelligence, and both images and voice recordings of Pelkey, his older sister, Stacey Wales, told USA TODAY. The AI replica's voice rang out during a sentencing hearing for the man who shot Pelkey in November 2021: 54-year-old Gabriel Paul Horcasitas. Pelkey's voice forgave Horcasitas: "I believe in forgiveness and in God. I always have, and I still do." Prosecutors charged Horcasitas with murder and endangerment because he also shot a nearby vehicle with a woman and two children inside, according to a probable cause statement obtained by USA TODAY. None of them were hurt. Back in 2023, Horcasitas was found not guilty on the murder charge and guilty on a lesser charge of manslaughter, his lawyer, Jason Lamm, told USA TODAY on May 9. A new trial was ordered due to a prosecutor failing to disclose exculpatory evidence, and the new trial began in March 2025, Lamm said. Horcasitas pleaded guilty to reckless endangerment and was also found guilty of manslaughter, Lamm said. He was sentenced to 10 1/2 years and will serve 85% of that, Lamm said. His team has filed a notice of appeal.
'A true representation of the man we knew'
Recently, Pelkey's family worked with Jessica Gattuso of Arizona Voice for Crime Victims, which provides pro bono legal representation, social services and training.
Gattuso encouraged the family to humanize Pelkey, and gathered about 50 letters of support, including from family, friends, elementary school teachers and military buddies, Wales said. Wales also said a video of the shooting that took her brother's life was shown in court. "The only thing the judge knew about my brother during trial was the video of him being blown away on the street from closed circuit TV," she said, adding that the judge also saw his autopsy photo. "That doesn't convey a lot of humanity." As Wales sat down to work on her own statement a week before the sentencing, she found it hard to summarize her brother, who impacted so many people, she said. She wondered what he would say, and she knew her brother would forgive his shooter, despite not forgiving the shooter herself. "I was able to write what he would say or what I thought he would say in five minutes," she said. Her husband and a colleague who work in tech helped her create the replica, she said. When it was completed, her husband was moved to tears. "This really felt like a true representation of the man we knew," Wales said. "We had one goal, which was to humanize Chris and to make a judge feel, and I believe that we were successful in doing so."

What happened that day?

On Nov. 13, 2021, Wales said that Horcasitas cut her brother off before the deadly interaction. Her brother then drove up in front of Horcasitas' car. Witnesses said that Pelkey was at a red light when Horcasitas pulled up behind him in a Volkswagen. When Horcasitas honked his horn, Pelkey got out of his truck and approached the man's car, according to a court record. Pelkey raised both of his hands in the air as if he was saying, "What the heck?" the document said. Pelkey was walking toward Horcasitas' car when witnesses heard shots, the statement said. Pelkey then went back to his truck and collapsed.
Horcasitas gave a witness a trauma kit so they could try to help Pelkey, then went back to his vehicle with his hands in the air and his gun in its holster, the document said. When police interviewed Horcasitas, he said he honked his horn at Pelkey after pulling up behind his truck "as a friendly gesture." He claimed Pelkey threatened to beat him up and had his fists clenched.

Judge, family, lawyers react to impact statement

Maricopa County Superior Court Judge Todd Lang, who presided over the case, said he "loved that AI" and its message of forgiveness. He said hearing the AI replica forgive Horcasitas spoke to Pelkey's character, as well as his family's character. Speaking to the family, Lang said he recalled some loved ones asking for the maximum sentence. "Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang told Pelkey's family. "I didn't hear him ask for the maximum sentence." Lamm said that once the defense team reviews transcripts, the use of AI will likely be brought up to the appellate court. His initial reaction to the replica was shock, and he found it a bit "cringey to reincarnate someone and then put words in their mouth." "While victims have an absolute right to address the court at sentencing, my view is that this crossed the line of not only ethics and morals, but good taste," Lamm said. He added that during the incident in November 2021, witnesses saw Pelkey behaving aggressively and he allegedly said: "Do you want a piece of me?" "Those were his last words, not simply what was said on the AI," Lamm said. Wales said that during sentencing, the judge took into account previous letters of support for her brother, as well as the AI replica. "This was only the capstone in 49 other voices that encompassed his life," she said. Wales said her brother was generous, and served three tours in Afghanistan and Iraq while in the U.S. Army. His AI replica was based on real thoughts and feelings he had, she said.
On May 1, the voice pushed those listening to love each other and live fully because time is short. "Embrace it, and stop worrying about those wrinkles," the replica said. "I'm going to go fishing now. Love you all. See you on the other side." Saleen Martin is a reporter on USA TODAY's NOW team. She is from Norfolk, Virginia - the 757. Email her at sdmartin@usatoday.com.
[39]
A Bullet Killed Him. AI Brought Him Back to Life in Court
Advancements in artificial intelligence aren't just entering creative industries -- they're making their way to the courtroom as well. An Arizona family used an AI-generated version of their deceased family member to give his own victim statement in court, local outlet ABC15 reports. Chris Pelkey, 37, was an Army veteran and avid fisherman killed in an alleged road rage incident on Nov. 13, 2021. According to the Chandler Police Department, Pelkey died after being shot by Gabriel Horcasitas, 54, who police say exited his car after a driving confrontation and fired his gun. (Attorneys for the state and Horcasitas did not respond to Rolling Stone's request for comment.) Horcasitas was convicted of endangerment and manslaughter in a 2023 trial, but was recently sentenced to 10.5 years in prison after Pelkey's victim statement was played in court -- a one-year increase from the state's recommended 9.5 years, according to ABC15. Pelkey's sister, Stacey Wales, told Fox 10 that she was inspired to use AI to recreate Pelkey's likeness after trying to collect victim impact statements for the trial. AI was used to make the image sound like Pelkey, but she wrote the script. "We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," she said. "But it was important not to make Chris say what I was feeling and to detach and let him speak because he said things that would never come out of my mouth, but I know would come out his." The impact statement used a picture of Pelkey and his voice profile to recreate a version of him. This video was interspersed with real video footage and photos of Pelkey, but also had him directly address people in the courtroom. "In another life, we probably could've been friends. I believe in forgiveness and in God who forgives.
I always have and I still do," the AI said. While the AI-assisted image was only used in the victim statement, its inclusion in the courtroom brings up interesting questions about how artificial intelligence could help -- or hurt -- in matters of law. This appears to be the first time artificial intelligence has been used to present a victim impact statement in a U.S. courtroom. The Arizona Supreme Court's Chief Justice Ann Timmer wasn't involved in the case. But she told ABC15 that "a measured approach is best" and that to respond to potential use in the future, the Arizona Supreme Court has formed a committee to make recommendations on the best way to use AI in the courtroom. "At bottom, those who use AI -- including courts -- are responsible for its accuracy," she said.
[40]
AI Lets Man Killed In Road Rage Shooting Deliver Own Victim Impact Statement
CHANDLER, Ariz. (AP) -- There were dozens of statements submitted to the court by family and friends of Christopher Pelkey when it came time to sentence the man convicted of shooting him during a road rage incident. They provided glimpses of Pelkey's humor, his character and his military service. But there was nothing quite like hearing from the victim himself -- in this case, an AI-generated version. In what's believed to be a first in U.S. courts, Pelkey's family used artificial intelligence to create a video using his likeness to give him a voice. The AI rendering of Pelkey told the shooter during the sentencing hearing last week that it was a shame they had to meet that day in 2021 under those circumstances -- and that the two of them probably could have been friends in another life. "I believe in forgiveness and in God who forgives. I always have and I still do," Pelkey's avatar told Gabriel Paul Horcasitas. The AI version of Pelkey went on to share advice for people to make the most of each day and to love each other, not knowing how much time one might have left. While use of artificial intelligence within the court system is expanding, it's typically been reserved for administrative tasks, legal research and case preparation. In Arizona, it's helped inform the public of rulings in significant cases. Using AI to generate victim impact statements marks a new -- and legal, at least in Arizona -- tool for sharing information with the court outside the evidentiary phases. Maricopa County Superior Court Judge Todd Lang, who presided over the road rage case, said after watching the video that he imagined Pelkey, who was 37 at the time of his killing, would have felt that way after learning about him. Lang also noted the video said something about Pelkey's family, who had expressed their anger over his death and had asked for Horcasitas to receive the maximum sentence. Horcasitas, 54, was convicted of manslaughter and sentenced to 10.5 years in prison. 
"Even though that's what you wanted, you allowed Chris to speak from his heart as you saw it," Lang said. The Associated Press left phone and email messages Wednesday seeking comment from Horcasitas' lawyer. The shooting happened the afternoon of Nov. 13, 2021, as both drivers were stopped at a red light. According to records, Pelkey was shot after getting out of his truck and walking back toward Horcasitas' car. Pelkey's sister, Stacey Wales, raised the idea of her brother speaking for himself. For years, while the case worked its way through the legal system, Wales said she thought about what she would say at the sentencing hearing. She struggled to get words down on paper. But when she thought about what her brother would say to the shooter, knowing he would have forgiven him, the words poured out of her. In Arizona, victims can give their impact statements in any digital format, said victims' rights attorney Jessica Gattuso, who represented the family. Arizona Supreme Court Justice Ann Timmer didn't address the road rage case specifically in an interview Wednesday. But she said the rise in popularity and accessibility to AI in recent years led to the formation of a committee to research best practices in the courts. Gary Marchant, a member of the committee and a law professor at Arizona State University, said he understands why Pelkey's family did it. But he warned the use of this technology could open the door to more people trying to introduce AI-generated evidence into courtrooms. "There's a real concern among the judiciary and among lawyers that deepfake evidence will be increasingly used," he said. "It's easy to create it and anyone can do it on a phone, and it could be incredibly influential because judges and juries, just like all of us, are used to believing what you see." Marchant pointed to a recent case in New York, where a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. 
It took only seconds for the judges to realize that the man addressing them from the video screen didn't exist at all. In the Arizona case, Wales said the AI-generated video worked because the judge had nearly 50 letters from family and friends that echoed the video's message. "There was a solid gold thread through all of those stories -- that was the heart of Chris," Wales said. "This works because it talks about the kind of person Chris was." ___ Yamat reported from Las Vegas. Associated Press reporter Susan Montoya Bryan in Albuquerque, New Mexico, contributed to this report.
[41]
From AI avatars to virtual reality crime scenes, courts are grappling with AI in the justice system
In a first for US courts, a Phoenix family used an AI-generated video of a deceased victim to deliver a forgiving message during a sentencing hearing. The judge, moved by the video, imposed the maximum 10.5-year sentence. Legal experts warn the emotional power of AI in court may raise ethical issues and deepen inequalities, prompting future legal challenges.

Stacey Wales gripped the lectern, choking back tears as she asked the judge to give the man who shot and killed her brother the maximum possible sentence for manslaughter. What appeared next stunned those in the Phoenix courtroom last week: An AI-generated video with a likeness of her brother, Christopher Pelkey, told the shooter he was forgiven. The judge said he loved and appreciated the video, then sentenced the shooter to 10.5 years in prison - the maximum sentence and more than what prosecutors sought. Within hours of the hearing on May 1, the defendant's lawyer filed a notice of appeal. Defense attorney Jason Lamm won't be handling the appeal, but said a higher court will likely be asked to weigh in on whether the judge improperly relied on the AI-generated video when sentencing his client. Courts across the country have been grappling with how to best handle the increasing presence of artificial intelligence in the courtroom. Even before Pelkey's family used AI to give him a voice for the victim impact portion - believed to be a first in U.S. courts - the Arizona Supreme Court created a committee that researches best AI practices. In Florida, a judge recently donned a virtual reality headset meant to show the point of view of a defendant who said he was acting in self-defense when he waved a loaded gun at wedding guests. The judge rejected his claim. And in New York, a man without a lawyer used an AI-generated avatar to argue his case in a lawsuit via video. It took only seconds for the judges to realize that the man addressing them from the video screen wasn't real.
Experts say using AI in courtrooms raises legal and ethical concerns, especially if it's used effectively to sway a judge or jury. And they argue it could have a disproportionate impact on marginalized communities facing prosecution. "I imagine that will be a contested form of evidence, in part because it could be something that advantages parties that have more resources over parties that don't," said David Evan Harris, an expert on AI deep fakes at UC Berkeley's business school. AI can be very persuasive, Harris said, and scholars are studying the intersection of the technology and manipulation tactics. Cynthia Godsoe, a law professor at Brooklyn Law School and a former public defender, said as this technology continues to push the boundaries of traditional legal practices, courts will have to confront questions they have never before had to weigh: Does this AI photograph really match the witness's testimony? Does this video exaggerate the suspect's height, weight, or skin color? "It's definitely a disturbing trend," she said, "because it could veer even more into fake evidence that maybe people don't figure out is false." In the Arizona case, the victim's sister told The Associated Press that she did consider the "ethics and morals" of writing a script and using her brother's likeness to give him a voice during the sentencing hearing. "It was important to us to approach this with ethics and morals and to not use it to say things that Chris wouldn't say or believe," Stacey Wales said. Victims can give their impact statements in any digital format in Arizona, said victims' rights attorney Jessica Gattuso, who represented the family. When the video played in the courtroom, Wales said only she and her husband knew about it. "The goal was to humanize Chris and to reach the judge," Wales said. After viewing it, Maricopa County Superior Court Judge Todd Lang said he "loved the beauty in what Christopher" said in the AI video. 
"It also says something about the family," he said. "Because you told me how angry you were, and you demanded the maximum sentence, and even though that's what you wanted, you allowed Chris to speak from his heart as you saw it." On appeal, the defendant's lawyer said, the judge's comments could be a factor for the sentence to be overturned.
[42]
Man shot dead in road rage incident reincarnated through AI video to...
A man shot dead in a road rage incident nearly four years ago appeared in an Arizona courtroom Monday to forgive his killer from beyond the grave -- through an eerie AI video played by his family. A lifelike simulacrum of Christopher Pelkey -- who was gunned down by Gabriel Horcasitas in 2021 following a dispute in Chandler, Ariz. -- spoke to a court audience in what is believed to be the first use of artificial intelligence to deliver a victim impact statement, according to local reports. "To Gabriel Horcasitas, the man who shot me: it is a shame we encountered each other that day in those circumstances," the artificial rendition of Pelkey said to a packed courtroom. "In another life, we probably could have been friends." "I believe in forgiveness and God who forgives. I always have, and still do," an AI version of Pelkey said. Horcasitas, 50, was found guilty of manslaughter for shooting Pelkey, 37, to death when he approached his car during a road rage incident in 2021. Pelkey's digital resurrection, created by his family, wore a logoless gray baseball cap, an olive green zipper hoodie and a full, ruddy beard. The mouth of the AI victim didn't always align with the words he was speaking in the clip, but the video still had a powerful effect. Judge Todd Lang was deeply moved by the artificial recreation. "I love that AI," the clearly emotional Judge Lang said, and then proceeded to give the defendant 10-and-a-half years for his role in Pelkey's death - a full year more than the prosecutors asked for. The AI video also featured a "real" photograph Pelkey took when he was still alive that was then run through an "old age" filter. "This is the best I can ever give you of what I would have looked like if I got the chance to grow old," the artificial version of Pelkey said. "Remember, getting old is a gift that not everybody has, so embrace it and stop worrying about those wrinkles." 
Pelkey's sister wrote the script that the AI version of her late brother spoke, telling AZ Family she wanted to give him a voice in his own manslaughter case. "I said, 'I have to let him speak,' and I wrote what he would have said, and I said, 'That's pretty good, I'd like to hear that if I was the judge'," Stacey Wales told the outlet. "I want the world to know Chris existed," Wales added. "If one person hears his name or sees this footage and goes to his Facebook page or looks him up on YouTube, they will hear Chris's love." In a statement to ABC15, Arizona Chief Justice Ann Timmer said she is excited about the potential benefits of AI in the courtroom, but is worried about its lasting effects. "AI can also hinder or even upend justice if inappropriately used," she said, noting that the court has pulled together an AI committee to make recommendations for how best to apply it in the courtroom.
[43]
Arizona Family Brings Slain Man Back To Life With AI To Address His Killer in Court
'In another life, we probably could have been friends,' the victim's avatar says. A slain Arizona man was resurrected in the form of an AI avatar to directly address his killer during his recent sentencing hearing. The computer-generated avatar of Christopher Pelkey, who was slain in 2021 during a violent road rage incident, spoke before the court during a May 1 sentencing hearing for Gabriel Horcasitas, who was convicted of shooting Pelkey while both men were stopped at a red light, according to The Washington Post. "Just to be clear for everyone seeing this. I am a version of Chris Pelkey re-created through AI," the avatar said before thanking the judge and turning his attention to his killer. "I would like to make my own impact statement to Gabriel Horcasitas, the man who shot me," the AI version of Pelkey said. "It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have, and I still do." The video ended with Pelkey's avatar saying goodbye to his family and signing off after saying: "Well, I'm going fishing now." A jury had convicted the shooter of manslaughter in March for opening fire on Pelkey during an incident in November 2021. Pelkey had gotten out of his car to address Mr. Horcasitas as he repeatedly honked his horn waiting for the traffic light to turn green. The victim's statement was created by Pelkey's sister, Stacey Wales, who turned to AI after the victim's attorney handling the case said, "try to bring him to life," during preparations for the family to provide an impact statement. "I said to myself, 'Well, what if Chris could make his own impact statement?'" Ms. Wales said to The Washington Post. Her husband, a tech entrepreneur who had used AI tools to animate photos and replicated voices in the past, suggested creating an avatar in her brother's likeness.
She wrote the avatar speech herself, saying she based it on what she thought her brother would say to his killer if given the opportunity, despite wanting Mr. Horcasitas to receive the toughest sentence possible for his actions. "I thought it was very effective," victims' rights attorney Jessica Gattuso said to the Post. "It was appropriate. I didn't know what kind of objections we might get or pushback." The digital likeness was enough to move Maricopa County Superior Court Judge Todd Lang nearly to tears. "I love that AI. Thank you for that," Judge Lang said to Pelkey's family in the courtroom after the video was played. "And as angry as you are, and justifiably angry as the family is, I heard the forgiveness and I know Mr. Horcasitas appreciated it, but so did I." "As I said, I like to think that I would go through that, I don't know that I would be as forgiving." Mr. Horcasitas was sentenced to 10 and a half years in prison.
[44]
Family creates AI video to depict Arizona man addressing his killer in court
CHANDLER, ARIZONA (Reuters) -A simulation of a dead man created by artificial intelligence addressed his killer in an Arizona court this month, in what appears to be one of the first such instances in a U.S. courtroom. Made by his family, an AI-generated avatar of Christopher Pelkey spoke in Maricopa County Superior Court on May 1, as a judge prepared to sentence Gabriel Paul Horcasitas for shooting and killing Pelkey in a 2021 road-rage incident. "It is a shame we encountered each other that day in those circumstances," the Pelkey avatar says in the video. "In another life, we probably could have been friends." The Pelkey avatar appears in the video sporting a long beard and green sweatshirt against a white backdrop. He cautions at the start that he is an AI version of Pelkey, which is apparent through the gaps in audio and slightly mismatched movement of his mouth. Pelkey, a U.S. Army veteran, was 37 at the time of the shooting. The video marked a novel use of AI in the legal system, which has viewed the rapidly growing technology with a mix of fascination and trepidation. Courts generally have strict rules on the types of information that can be presented in legal proceedings, and several lawyers have been sanctioned after AI systems created fake cases that they cited in legal briefs. Pelkey's relatives were given more leeway to present the AI-generated video to the judge at sentencing, given that it was not evidence in the case. Horcasitas, who was sentenced to 10.5 years in state prison, had already been convicted on manslaughter and endangerment charges. Pelkey's sister Stacey Wales said she scripted the AI-generated message after struggling to convey years of grief and pain in her own statement. She said she was not ready to forgive Horcasitas, but felt her brother would have a more understanding outlook. "The goal was to humanize Chris, to reach the judge, and let him know his impact on this world and that he existed," she told Reuters.
Generative AI, Wales said, is "just another avenue that you can use to reach somebody." Wales said she worked with her husband and a family friend, who all work in the tech industry, to create it. Harry Surden, a law professor at the University of Colorado, said the use of generative AI material in court raises ethical concerns, as others may seek to use those tools to play on the emotions of judges and juries. The content is a simulation of reality, not the verified evidence that courts typically assess, Surden said. "What we're seeing is the simulations have gotten so good that it completely bypasses our natural skepticism and goes straight to our emotion," he said. (Reporting by Liliana Salgado in Chandler, Arizona and Andrew Goudsward in Washington; editing by Andy Sullivan and Aurora Ellis)
In a groundbreaking case, an AI-generated video of a deceased victim delivered an impact statement during a sentencing hearing, raising ethical and legal questions about the use of artificial intelligence in courtrooms.
In a groundbreaking development, an AI-generated video of Christopher Pelkey, a victim of a fatal road rage incident, was presented during a sentencing hearing in an Arizona courtroom. This marks what appears to be the first known instance of a generative AI deepfake being used in a victim impact statement in the United States [1][2][3].
Christopher Pelkey, a 37-year-old Army veteran, was killed in a road rage shooting in 2021. During the sentencing of Gabriel Paul Horcasitas, convicted of manslaughter, Pelkey's family presented a brief video depicting an AI version of the deceased [1][2]. The AI-generated "clone" addressed the convicted killer, expressing forgiveness and offering life advice [1][3].
Pelkey's family reportedly created the video by training an AI model on various clips of Pelkey, applying an "old age" filter to simulate his current appearance [1]. In the video, the AI version of Pelkey, wearing a green hoodie and gray baseball cap, addressed Horcasitas directly [1][2]. The avatar acknowledged its AI nature and delivered a message of potential friendship and forgiveness [2][3].
This novel use of AI in the legal system has sparked discussions about its ethical and legal implications:
Judge's Response: The judge, Todd Lang, expressed appreciation for the video and sentenced Horcasitas to 10.5 years in prison, the maximum sentence [1][3].
Legal Concerns: Defense attorney Jason Lamm suggested that the use of the AI-generated video could be grounds for appeal [4][5].
Ethical Debates: Experts like Gary Marchant from Arizona State University warn about the potential dangers of using such technology in courtrooms [5].
Procedural Questions: The case raises questions about the admissibility and impact of AI-generated content in legal proceedings [3][4].
This case is part of a growing trend of AI applications in legal settings:
Previous Incidents: There have been instances of lawyers using AI to draft legal filings, sometimes resulting in "hallucinated" case citations [1].
Judicial Response: Courts have punished attorneys for deceptive use of AI, but rules remain unclear [1].
Ongoing Discussions: The legal community, including Supreme Court Chief Justice John Roberts, is grappling with the potential benefits and drawbacks of AI in courtrooms [1].
Stacey Wales, Pelkey's sister, explained that the AI-generated message was an attempt to humanize her brother and convey his forgiving nature [2][3]. She emphasized the careful consideration of ethics in creating the video [3].
This case sets a precedent that could influence future use of AI in legal proceedings. It raises important questions about the authenticity of evidence, emotional manipulation in court, and the potential for resource disparities to affect legal outcomes [3][4][5]. As AI technology continues to advance, the legal system will need to adapt and establish clear guidelines for its use in courtrooms.