Stacey Wales spent two years working on the victim impact statement she planned to present in court after her brother was shot and killed in a road rage incident in 2021. But even after all that time, Wales felt the statement would not be enough to capture the humanity of her brother, Christopher Pelkey, and what he would have wanted to say.
So Wales decided to let Pelkey deliver the statement himself – with the help of artificial intelligence (AI).
She and her husband created an AI-generated video of Pelkey that was played at his killer’s sentencing hearing earlier this month, recreating Pelkey’s likeness and voice reading a script that Wales wrote. In it, Pelkey expressed forgiveness toward the shooter, something Wales said she knew her brother would have done but that she herself was not yet ready to do.
“The only thing that kept entering my head and that I kept hearing was Chris and what he would say,” Wales told CNN. “I had to step back and be very careful writing this on Chris’s behalf, because what he was saying is not necessarily what I believe, but I know it’s what he would think.”
AI is playing an increasingly prominent role in legal and criminal proceedings, though this is believed to be the first time the technology has been used to recreate a victim delivering his own impact statement. Experts say courts and society will increasingly face ethical and practical questions about using AI to recreate deceased people – both inside and outside the courtroom – as the technology becomes more humanlike.
“We’ve all heard the expression ‘seeing is believing, hearing is believing,’” said Paul Grimm, a professor at Duke University School of Law and a former federal district judge in Maryland. “These kinds of technologies have a tremendous capacity to persuade and influence, and we will always have to consider whether they distort the record the jury or judge has to decide on in a way that gives an unfair advantage to one side or the other.”
Judge Todd Lang of the Maricopa County Superior Court ultimately sentenced Pelkey’s killer, Gabriel Paul Horcasitas, to ten and a half years in prison for manslaughter – more than the nine and a half years the state had requested – and to a longer total term that included an endangerment charge.
“I love this idea. Thank you for that,” Lang said, according to a recording of the hearing. “As angry as they are, and as justified as the family’s anger is, I heard the forgiveness.”
Pelkey’s story was previously reported by ABC15 Arizona.
Bringing Pelkey into the courtroom
Pelkey was the youngest of three siblings, a veteran and, according to Wales, “the most forgiving and friendly member of the family.” He was killed in November 2021 in Chandler, Arizona, at the age of 37.
Autopsy photos of Pelkey and surveillance video capturing the moment of his killing were shown during the trial. But after the jury found Horcasitas guilty of manslaughter, Wales wanted the judge at the sentencing hearing to see what Pelkey was like when he was alive.
Wales and her husband, Tim Wales, work in technology – she said they had previously created video replicas of former CEOs and founders to speak at business conferences – so in the weeks before the sentencing hearing, they decided to try replicating Pelkey in the same way.

Christopher Pelkey’s family, including Stacey Wales (fourth from left), poses for a photograph with a portrait of Pelkey. Courtesy of Stacey Wales
The couple used several software platforms, along with photographs and an old video of Pelkey, to create the AI replica shown at the May 1 hearing. The day before the sentencing, Wales called her lawyer, Jessica Gattuso, to get her sign-off.
“I was concerned, I thought we were going to get an objection or some kind of pushback. I did what research I could, but I couldn’t find anything because I had never heard of this being done before,” Gattuso told CNN, adding that she relied on an Arizona law that gives victims discretion over how they deliver their statements.
Like other AI videos portraying people, Pelkey’s recreation is slightly halting and uncanny, and it opens with an acknowledgment that it was created with AI. But Wales says she believes the video captured his essence.
“It is a shame we encountered each other that day, in those circumstances,” the AI version of Pelkey says in the video. “In another life, we probably could have been friends.”
Horcasitas’ lawyer, Jason Lamm, says the defense did not receive prior notice that AI would be used in a victim impact statement. He adds: “It appears the judge gave some weight to the AI video, and that is an issue that will likely be raised on appeal.”
How AI is changing justice
Judges are increasingly confronted with decisions about AI’s role in the courtroom – including whether it should play a role at all.
Last month, in an unrelated case in New York, an appeals judge swiftly shut down a plaintiff’s attempt to have an AI-generated avatar argue his case without first making clear that it was not a real person. And last week, a panel of federal judges advanced a rule that would require AI-generated evidence to meet the same reliability standards as testimony from human expert witnesses, according to a Reuters report.
Advances in AI have also raised questions about whether the technology could replace human jobs in the legal field.
“[AI] is not going away, and we’re going to see more of these cases,” said Grimm, who was not involved in the Pelkey case. Judges tend to get a little nervous about this technology, he said, so at first we are likely to see more “no’s” than “yes’s.”
Judges may be especially hesitant to allow AI-generated evidence or visual aids to be presented to a jury, which, unlike a judge at a sentencing, has not been trained to keep emotion from overriding the facts of the case, Grimm points out. There are also concerns that AI could misrepresent a party in a proceeding, for example by making them appear more sympathetic.
Grimm suggested that, going forward, the defense should have the opportunity to view AI-generated content and raise any objections for a judge to consider before it is shown in court.
Even Wales cautioned that the technology should be used carefully.
“This was not evidence; the jury never saw it. It wasn’t even made until after the verdict,” Wales said. “This is an opinion. And the judge was allowed to see a human being, who is no longer here, for who he was.”
Ultimately, she said, recreating her brother with AI was “healing” for her family. After the hearing, Wales said, her 14-year-old son told her: “Thank you so much for making that. I needed to see and hear Uncle Chris one more time.”
CNN’s Hazel Tang contributed to this report.