AI Avatar Testifies For Deceased Victim In US Court
In November 2021, in Chandler, Arizona, a tragic road rage altercation resulted in Chris Pelkey being shot and killed by Gabriel Horcasitas. Horcasitas was subsequently tried and convicted of reckless manslaughter.
During Horcasitas's sentencing, Pelkey’s family wished to deliver a “victim impact statement,” a personal account to the judge about who Pelkey was. Finding it difficult to capture their sentiments in words, they turned to a novel solution: an AI-generated avatar of Pelkey. This allowed him to seemingly “talk” directly to the judge, using his face and voice.
The video was played at the sentencing hearing, marking the first known instance in a United States court, and likely anywhere in the world, of an AI-generated avatar of a deceased victim delivering a victim impact statement.
How Was The AI Avatar Made And Received
The AI avatar was the creation of Pelkey’s sister, Stacey Wales, and her husband, Tim. Stacey wrote the script for the avatar, basing the words on what she believed Pelkey would have expressed, not on his actual prior statements. She later publicly explained how she created the AI video of her brother.
The avatar was developed using voice samples from Pelkey's old videos and family photos, including one used at his funeral. In the video message, the AI Pelkey spoke of believing in forgiveness and “a God who forgives,” and suggested that in “another life,” he and Horcasitas might have been friends.
Judge Todd Lang, who permitted the AI statement, said he “loved” the AI and felt the forgiveness it conveyed was “genuine”; his reaction was captured on the courtroom recording. Ultimately, Horcasitas received a sentence of ten-and-a-half years, matching the family's request and exceeding the prosecution's nine-year recommendation.
Could This Happen In Australia
Generally, it is unlikely that Australian courts would accept such technological innovations in sentencing hearings. Court rules across Australian states and territories are broadly similar to one another and considerably more restrictive than their US counterparts. These rules allow victims or their families to read written statements, usually vetted by the prosecution. While victims may include drawings and photos with approval, the focus remains on written or directly spoken accounts.
A victim typically reads their own statement. If the victim is deceased, family members can make a statement about their own trauma and loss; sometimes the prosecutor reads the statement aloud or submits it in writing to the judge. To date, no Australian court has allowed family members to speak directly on behalf of a deceased victim in this manner, as family statements are confined to the harm the family members themselves have suffered.
Furthermore, victims may be cross-examined by defence counsel on the content of their statement, a procedure that would be impossible with an AI avatar. Creating, vetting and editing an AI avatar would also be time-consuming and costly for prosecutors.
Compared to the US, Australian courts generally have less tolerance for dramatic readings or the use of audio-visual materials. In the US, victims often have more freedom to express emotions, delve into personal narratives, and even show videos of the deceased to provide the court with a fuller understanding of the victim as an individual. The use of an AI avatar, therefore, is not a significant departure from what is already permissible in many US courts.
Despite these allowances in the US, there is still concern that the emotional weight of a direct statement from an AI victim could be used to manipulate the court by putting words into the victim’s virtual mouth. As seen in the Arizona case, Judge Lang was clearly moved by the AI Pelkey's statement.
While changes to Australian law would be needed to specifically ban AI recordings, current sentencing practices are already so restrictive that they effectively preclude such technology. It appears Australia is still some way off from adopting practices like Arizona's, where an AI avatar of a deceased person can speak from “beyond the grave.”