So a virtual corpse puppet?
That should never be allowed in court. What a crock of shit.
It was a victim impact statement, not subject to the rules of evidence. The shooter had already been found guilty, and this was an impact statement from the victim’s sister, to sway how the shooter should be sentenced. The victim’s bill of rights says that victims should be allowed to choose the method in which they make an impact statement, and his sister chose the AI video.
I agree that it shouldn’t be admissible as evidence. But that’s not really what’s being discussed here, because it wasn’t being used as evidence. The shooter was already found guilty.
Looking at the downvotes, remember upvoting an article ≠ an endorsement of the shitty technology being discussed in the article.
We shit on the technology in the comments, and upvote it so more of us can read about it and shit on it.
Maybe they like the technology and that’s why they’re downvoting the story.
If I am murdered, please don’t do this. I do not care if you feel like it will help you process the events.
Thanks for sharing; I thought this was a fascinating read, especially since it ended on a positive note and not pure condemnation. It seems totally Black Mirror-esque, but I do wonder how many of the commentators here attacking it didn’t read the article. The family obviously didn’t make this decision lightly, given how much work it took to create it, and even the judge appreciated the novel approach. This is probably one of the best-case use scenarios relative to the abyss of unstoppable horror that awaits us.
Fascinating but also kind of creepy.
Perhaps; it seemed like they knew the decedent well enough to know that he would appreciate this, from everything that the article says. With that said, I also won’t be surprised if templates for wills or living trusts add a no-duplication statement by default over the coming years.
If my family hired an actor to impersonate me at my killer’s trial and give a prepared speech about how I felt about the situation it would be thrown out of court.
If my family hired a cartoonist or movie studio to create a moving scene with my face recreated by digital artists and a professional voice actor to talk about my forgiveness for my death, it would be thrown out of court.
That they used a generative program to do it and the Judge allowed the video to influence the sentence as if it were a statement by the deceased is deeply troubling.
Apparently, it was required to be allowed in that state:
Reading a bit more, during the sentencing phase in that state people making victim impact statements can choose their format for expression, and it’s entirely allowed to make statements about what other people would say. So the judge didn’t actually have grounds to deny it.
No jury during that phase, so it’s just the judge listening to free-form requests in both directions. It’s gross, but the rules very much allow the sister to make a statement about what she believes her brother would have wanted to say, in whatever format she wanted.
From: https://sh.itjust.works/comment/18471175
influence the sentence
To be fair, from what I’ve seen judges’ sentencing decisions vary wildly regardless, sadly, and sentences should be more standardized. I wonder what it would’ve been otherwise.
Demon technology. Did we learn nothing from Doom 2016?
Cliffnotes?
- Woman’s brother was killed in a road rage incident
- In preparing her victim impact statement for the court, she struggled to find a way to properly represent her brother’s voice
- Her husband works with AI and helped her generate a video of her brother for the victim impact statement
- The video was very well received and apparently true to her brother’s personality. Though she didn’t forgive the killer, she knew her brother would. So, in the AI video, “he” did.
- After all the real people made their statements to the judge, the video was played
- The judge loved it and thanked the woman
Appreciated – my apologies that I wasn’t clear. I was curious about the connection to “did we learn nothing from doom 2016” that the OP referenced.
For this? The guy who was brought back through AI was killed in a hit and run, then they brought the AI version of him to court to give a statement from beyond the grave, of sorts. I think it’s immoral as fuck, but I’m sure I’ll get told why it’s actually not.
I was wondering what happened in “doom 2016”. And now I can’t tell if you’re summarizing the article or what happened in doom 2016.
So basically the UAC was fucking around with technology and went too far in their pursuit, opening a portal to Hell in an attempt to harness it as a power source. Then the game itself kicks off after everything goes wrong and all hell breaks loose.
How does that relate to videos of dead people speaking someone else’s words? The only reanimated people in Doom 2016 are the shambling zombies.
Technology isn’t inherently good or evil. It entirely depends on the person using it. In this case, it had a very positive impact on everybody involved.
To me this is the equivalent of taxidermying a person then using them as a puppet. Sure it might have a positive impact on some people but it’s immoral at best.
Is it reaaaalllly immoral if the kids just freakin’ love it though?
This. I don’t see how it’s any different from making an ‘AI video’ of a murder victim thanking his murderer for easing his pain, in order to ‘make people feel better’ after a rich perpetrator games the system and is acquitted via dubious means. It’s blatant manipulation.
What makes it immoral? Nobody was hurt in any way, physically, emotionally, or financially. They disclosed the use of AI before showing the video. It even helped the perpetrator get a smaller sentence (IMO prison as a concept is inhumane, so less prison time is morally right).
It just feels wrong, man. I’m of the belief that we should let the dead rest in peace. Bringing them back through AI or other means fundamentally goes against that. I’m also against taxidermy, but that’s not the debate we’re having rn. This lands in that category for me. I’m neutral on AI broadly, but this is where I draw the line.
“It just feels wrong” isn’t a valid basis for morality. Lots of people say the idea of someone being gay just feels wrong. Lots of people say people being non-Muslim just feels wrong.
Reminds me of the crime skeleton, shout out to anyone who knows what I’m talking about.
Who could forget? Truly an invention before its time.
https://www.atlasobscura.com/articles/criminal-confession-skeleton-patent
It sounds like it was played after the sentence was given? Would be kind of sketchy if not.
This was played before sentencing. It doesn’t say it here, but the article I read earlier today stated that because of this video, the judge issued a sentence greater than the maximum recommended by the State. If true, then it really calls into question the sentence itself and how impartial the judge was.
Oh - then that’s fucked up. Synthesizing some narrative to potentially coerce an outcome seems like a slippery slope. (Not necessarily saying that’s exactly what happened here.)
It appears this was a Victim impact statement.
A victim impact statement is a written or oral statement made as part of the judicial legal process, which allows crime victims the opportunity to speak during the sentencing of the convicted person or at subsequent parole hearings.
From the article (emphasis mine):
But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey’s case.
"Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited," she told NPR via email.
Ah yes, appeals to emotion, my favorite part of the judicial process.