For this? The guy who was brought back through AI was killed in a hit and run, then they brought the AI version of him to court to give a statement from beyond the grave, of sorts. I think it’s immoral as fuck, but I’m sure I’ll get told why it’s actually not.
etchinghillside@reddthat.com
on 08 May 00:10
I was wondering what happened in “Doom 2016”. And now I can’t tell if you’re summarizing the article or what happened in Doom 2016.
So basically the UAC was fucking around with technology, went too far in their pursuit, and opened a portal to Hell in an attempt to harness it as a power source. Then the game itself kicks off after everything goes wrong and all hell breaks loose.
Woman’s brother was killed in a road rage incident
In preparing her victim impact statement for the court, she struggled to find a way to properly represent her brother’s voice
Her husband works with AI and helped her generate a video of her brother for the victim impact statement
The video was very well received and apparently true to her brother’s personality. Though she didn’t forgive the killer, she knew her brother would. So, in the AI video, “he” did.
After all the real people made their statements to the judge, the video was played
The judge loved it and thanked the woman
etchinghillside@reddthat.com
on 08 May 00:15
Appreciated – my apologies that I wasn’t clear. I was curious about the connection to “did we learn nothing from doom 2016” that the OP referenced.
Saik0Shinigami@lemmy.saik0.com
on 08 May 00:44
etchinghillside@reddthat.com
on 08 May 03:43
You’re killing me.
Saik0Shinigami@lemmy.saik0.com
on 08 May 03:52
Nah. That’s Doom Guy’s MO. Rip and tear until it’s done.
Saik0Shinigami@lemmy.saik0.com
on 08 May 00:43
In preparing her victim impact statement for the court, she struggled to find a way to properly represent her brother’s voice
Should clarify that the woman wrote the script; the AI just generated the voice and image. The AI read a script the woman had written in her brother’s tone, putting aside her own feelings.
Technology isn’t inherently good or evil. It entirely depends on the person using it. In this case, it had a very positive impact on everybody involved.
To me, this is the equivalent of taxidermying a person and then using them as a puppet. Sure, it might have a positive impact on some people, but it’s immoral at best.
BumpingFuglies@lemmy.zip
on 08 May 00:28
What makes it immoral? Nobody was hurt in any way, physically, emotionally, or financially. They disclosed the use of AI before showing the video. It even helped the perpetrator get a smaller sentence (IMO prison as a concept is inhumane, so less prison time is morally right).
It just feels wrong, man. I’m of the belief that we should let the dead rest in peace. Bringing them back through AI or other means fundamentally goes against that. I’m also against taxidermy, but that’s not the debate we’re having rn. This lands in that category for me. I’m neutral on AI broadly, but this is where I draw the line.
"It just feels wrong" isn't a valid basis for morality. Lots of people say the idea of someone being gay just feels wrong. Lots of people say people being non-Muslim just feels wrong.
Oh, I agree that it’s creepy and something that could very easily be abused. But in this case, it seems to have been the right move. Whether the dead brother would have approved, we’ll never know. But the living sister seemed to earnestly believe he would have, and that’s enough for me.
No doubt it’s weird, but it was also a genuine attempt by a sister to speak for her beloved brother. I think it’s beautiful and a perfect example of the importance of keeping an open mind, especially regarding things that make us uncomfortable.
Because a judge allowing anyone to represent their views in court as though those views belong to someone else is a textbook "bad idea." It is a misrepresentation of the truth.
Not at all, because it would have been her making claims about what she believes her brother would have said, and not a simulacrum of her brother speaking her words with his voice.
You can say that all you want, but when your brain is presented with a video of a person, using that person's voice, you're going to take what's being said as being from that person in the video.
True, many people would have that problem, which is why the context in which the video was shown was acceptable; it was after the verdict had been given.
Such a thing should not impact sentencing, either. The judge allowed it, the judge was swayed by it, it impacted sentencing. This is wrong.
JeeBaiChow@lemmy.world
on 08 May 00:29
This. I don’t see how it’s any different from making an “AI video” about a murder victim thanking his murderer for easing his pain, in order to “make people feel better” after a rich perpetrator games the system and is acquitted via dubious means. It’s blatant manipulation.
A victim impact statement is a written or oral statement made as part of the judicial legal process, which allows crime victims the opportunity to speak during the sentencing of the convicted person or at subsequent parole hearings.
From the article (emphasis mine):
But the use of AI for a victim impact statement appears novel, according to Maura Grossman, a professor at the University of Waterloo who has studied the applications of AI in criminal and civil cases. She added that she did not see any major legal or ethical issues in Pelkey’s case.
"Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited," she told NPR via email.
This was played before sentencing. It doesn’t say it here, but the article I read earlier today stated that because of this video, the judge issued a sentence greater than the maximum recommended by the State. If true, then it really calls into question the sentence itself and how impartial the judge was.
etchinghillside@reddthat.com
on 08 May 03:41
Oh - then that’s fucked up. Synthesizing some narrative to potentially coerce an outcome seems like a slippery slope. (Not necessarily saying that’s exactly what happened here.)
ragebutt@lemmy.dbzer0.com
on 08 May 00:36
If I am murdered, please don’t do this. I do not care if you feel like it will help you process the events.
simplejack@lemmy.world
on 08 May 00:44
Looking at the downvotes, remember upvoting an article ≠ an endorsement of the shitty technology being discussed in the article.
We shit on the technology in the comments, and upvote it so more of us can read about it and shit on it.
sem@lemmy.blahaj.zone
on 08 May 02:14
Maybe they like the technology and that’s why they’re downvoting the story.
That should never be allowed in court. What a crock of shit.
mic_check_one_two@lemmy.dbzer0.com
on 08 May 21:11
It was a victim impact statement, not subject to the rules of evidence. The shooter had already been found guilty, and this was an impact statement from the victim’s sister, to sway how the shooter should be sentenced. The victim’s bill of rights says that victims should be allowed to choose the method in which they make an impact statement, and his sister chose the AI video.
I agree that it shouldn’t be admissible as evidence. But that’s not really what’s being discussed here, because it wasn’t being used as evidence. The shooter was already found guilty.
joshchandra@midwest.social
on 08 May 20:34
Thanks for sharing; I thought this was a fascinating read, especially since it ended on a positive note and not pure condemnation. It seems totally Black Mirror-esque, but I do wonder how many of the commentators here attacking it didn’t read the article. The family obviously didn’t make this decision lightly, given how much work it took to create it, and even the judge appreciated the novel approach. This is probably one of the best-case use scenarios relative to the abyss of unstoppable horror that awaits us.
joshchandra@midwest.social
on 08 May 20:41
Perhaps; it seemed like they knew the decedent well enough to know that he would appreciate this, from everything that the article says. With that said, I also won’t be surprised if templates for wills or living trusts add a no-duplication statement by default over the coming years.
If my family hired an actor to impersonate me at my killer’s trial and give a prepared speech about how I felt about the situation it would be thrown out of court.
If my family hired a cartoonist or movie studio to create a moving scene with my face recreated by digital artists and a professional voice actor to talk about my forgiveness for my death, it would be thrown out of court.
That they used a generative program to do it and the Judge allowed the video to influence the sentence as if it were a statement by the deceased is deeply troubling.
joshchandra@midwest.social
on 09 May 12:45
Apparently, it was required to be allowed in that state:
Reading a bit more, during the sentencing phase in that state people making victim impact statements can choose their format for expression, and it’s entirely allowed to make statements about what other people would say. So the judge didn’t actually have grounds to deny it.
No jury during that phase, so it’s just the judge listening to free form requests in both directions.
It’s gross, but the rules very much allow the sister to make a statement about what she believes her brother would have wanted to say, in whatever format she wanted.
From what I’ve seen, to be fair, judges’ decisions have varied wildly regardless, sadly, and sentences should be more standardized. I wonder what it would’ve been otherwise.
Demon technology. Did we learn nothing from doom 2016 ?
Cliffnotes?
How does that relate to videos of dead people speaking someone else’s words? The only reanimated people in Doom 2016 are the shambling zombies.
Is a video game. en.wikipedia.org/wiki/Doom_(2016_video_game)
That must be a touchy point for someone of your username
Those were not his words. They were someone else's words spoken by a very realistic puppet they made of him after he died.
That's weird at best, and does not belong in a court.
So we agree on one point, weirdness.
It’s still got no business in a courtroom.
Why not? It wasn’t used to influence the trial in any way; it was just part of the victim impact statements after the verdict was rendered.
So it would’ve been equally bad if instead of a video, she’d just read a statement she’d written in his voice? Something along the lines of:
But that’s what she did. She was upfront about the fact that it was an AI video reciting a script that she’d written.
Wait but no, not like that, only the positive way I see it.
Is it reaaaalllly immoral if the kids just freakin’ love it though?
It sounds like it was played after a sentencing was given? Would be kind of sketchy if not.
It appears this was a Victim impact statement.
Ah yes, appeals to emotion, my favorite part of the judicial process.
It feels like that’s the point of victim impact statements, even though it’s probably not supposed to be
I just can’t upvote this trash story even though you are correct about the usual reason for upvoting posts even when the subject matter is terrible.
Reminds me of the crime skeleton, shout out to anyone who knows what I’m talking about.
Who could forget? Truly an invention before its time.
atlasobscura.com/…/criminal-confession-skeleton-p…
Fascinating but also kind of creepy.
Apparently, it was required to be allowed in that state:
From: sh.itjust.works/comment/18471175