
He was killed in a road rage incident. His family used AI to bring him to the courtroom to address his killer
CNN
Stacey Wales spent two years working on the victim impact statement she planned to give in court after her brother was shot to death in a 2021 road rage incident. But even after all that time, Wales felt her statement wouldn't be enough to capture her brother Christopher Pelkey's humanity and what he would've wanted to say.

So, Wales decided to let Pelkey give the statement himself — with the help of artificial intelligence. She and her husband created an AI-generated video version of Pelkey to play during his killer's sentencing hearing earlier this month. In a recreation of Pelkey's own voice, it read a script that Wales wrote. In it, the AI version of Pelkey expressed forgiveness to the shooter, something Wales said she knew her brother would have done but she wasn't ready to do herself just yet.

"The only thing that kept entering my head that I kept hearing was Chris and what he would say," Wales told CNN. "I had to very carefully detach myself in order to write this on behalf of Chris because what he was saying is not necessarily what I believe, but I know it's what he would think."

AI is increasingly playing a role in legal and criminal justice processes, although this is believed to be the first time AI has been used to recreate a victim for their own impact statement. And experts say the world will increasingly have to grapple with ethical and practical questions about the use of AI to replicate deceased people — both inside courtrooms and beyond them — as the technology becomes more human-like.

"We've all heard the expression, 'seeing is believing, hearing is believing,'" said Paul Grimm, a Duke University School of Law professor and former district court judge in Maryland. "These kinds of technologies have tremendous impact to persuade and influence, and we're always going to have to be balancing whether or not it is distorting the record upon which the jury or the judge has to decide in a way that makes it an unfair advantage for one side or the other."

