
Chris Pelkey was shot dead in a road rage incident in Arizona, USA, three years ago — and recently "returned" to court to confront his killer through artificial intelligence.
During the sentencing of Gabriel Horcasitas, the man convicted of shooting Pelkey at a red light, Pelkey's family used artificial intelligence (AI) to deliver a victim impact statement on his behalf after his death.
The technology recreated Pelkey's voice and likeness using past voice recordings, photos and videos. His sister, Stacey Wales, wrote the words for the AI-generated video, reflecting what she believes her forgiving brother would have said.
"To Gabriel Horcasitas, the man who shot me: it is a shame we encountered each other that day," the AI rendering of Pelkey said in court. "In another life, we probably could have been friends. I believe in forgiveness, and in a God who forgives. I always have, and I still do."
Surreal and unprecedented: an AI "version" of a murder victim is used at sentencing:
Chris Pelkey was killed in a road shooting in Arizona three years ago.
But with the help of artificial intelligence, he returned earlier this month to deliver a victim impact statement at his killer's sentencing… pic.twitter.com/6xl1hvqwmy
— Thomas (@thomas984634784) May 8, 2025
The video shown at sentencing featured an AI version of Pelkey wearing a grey baseball cap. Judge Todd Lang, who presided over the case, responded positively to the use of AI in court and sentenced Horcasitas to 10 and a half years in prison for manslaughter.
"I loved that AI, thank you for that. As angry as you are, and as justifiably angry as the family is, I heard the forgiveness," Lang said. "I feel that that was genuine."
Paul Grimm, a retired federal judge and professor at Duke Law School, said he was not surprised by the use of AI in this case, noting that Arizona courts have already used AI in other ways, such as making Supreme Court rulings easier for the public to understand.
Because the video was shown during the sentencing phase, after the verdict had already been reached, its use was considered acceptable.
However, some experts warn against using such technology in future legal cases. Derek Leben, a professor of business ethics at Carnegie Mellon University, expressed concern about ensuring that AI-generated statements remain faithful to what the deceased would actually have wanted.
"If we have other people doing this moving forward, are we always going to be faithful to what the person — the victim in this case — would have wanted?" Leben asked.
Wales, however, believes her family's use of AI honors her brother's values.
"We approached this with morals and ethics, because this is a powerful tool," she said. "Just like a hammer can be used to break a window or rip down a wall, it can also be used as a tool to build a house — and that's how we used this technology."