He has cancer — so he made an AI version of himself for his wife after he dies (www.npr.org)
from BevelGear@beehaw.org to technology@beehaw.org on 13 Jun 00:57
https://beehaw.org/post/14406448

Company he works at eternos.life

#technology


autotldr@lemmings.world on 13 Jun 01:00 next collapse

🤖 I’m a bot that provides automatic summaries for articles:

Click here to see the summary

And my wife said, ‘Hey, one of the things I will miss most is being able to come to you, ask you a question, and you will sit there and calmly explain the world to me,’" he said. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself. You’re reading the Consider This newsletter, which unpacks one major news story each day. AI has access to all sorts of knowledge, but his wife only wants to ask it questions that only Bommer would know the answers to. Normally, uploading this information would take weeks or months, but Bommer needed to put it together in just a few days. But when thinking about what questions she might end up asking this tool, once Bommer dies: "I assume perhaps to read me a poem. — Saved 72% of original text.

thingsiplay@beehaw.org on 13 Jun 01:25 next collapse

So it hurts long after his death.

naevaTheRat@lemmy.dbzer0.com on 13 Jun 01:25 next collapse

My wife is fortunately still alive so maybe that colours my view. However when I’ve lost other people the blessed anaesthesia of forgetting has been essential in being able to function.

From the short quote it seems like she maybe has a healthy-ish attitude but idk… I feel like this would be a shallow simulacrum that prolongs grief.

henfredemars@infosec.pub on 13 Jun 02:19 next collapse

I don’t believe humans are meant to manage loss in this way — stretching out an imitation of our loved one. As painful as it is, I personally believe humans need to say goodbye. I feel this gets in the way of feeling and truly accepting the loss so that a person can move forward.

Loss is truly heavy, but I do not believe this is better or healthy.

thingsiplay@beehaw.org on 13 Jun 02:27 next collapse

People who can’t get over losing someone will sorrow for the rest of their life, or until they get over it. And AI won’t help to get over it. Death is part of our life, and as soon as you don’t accept it, it becomes pain.

It was last year, I think, when I read that someone recreated the lost son (or some other family member, I forget) of a mother in a VR environment. And she could see him again in VR. Absolute madness! What does this do to the person? Now couple that with an AI… man, the future is grim…

henfredemars@infosec.pub on 13 Jun 02:35 collapse

I had this conversation with my wife once. I let her know that it is my advance wish that you must allow me to complete the cycle of life. Anything else, any reconstruction of me that technology allows, is to me, an abomination. Keep the pictures, keep the memories, but don’t keep me here when I am gone.

I refrain from judging the decisions of others where possible, but this is my personal wish.

naevaTheRat@lemmy.dbzer0.com on 13 Jun 02:32 next collapse

Yeah. I am not a Buddhist but I’ve always found something rings true in the reflections on impermanence. When we bond with someone we accept the pain of loss, and when we feel it most people seem to describe relief once able to “let go” and accept it being over.

It seems to me that encouraging clinging and reminiscing stunts you a bit and only really provides temporary relief of the loss while drawing out the time it takes to process it.

Idk though, maybe I’ll have the misfortune to feel differently some day. It’s hard to judge someone hanging out with their spouse watching death creep closer each day. I have approximately zero idea what my opinions would be in the face of that.

Mycatiskai@lemmy.ca on 13 Jun 04:27 next collapse

My sister has hundreds of YouTube videos she used to help her students learn between music lessons. It will be two years soon since she died, I haven’t been able to watch even one.

I like to remember her in my mind, it hurts less than seeing her when she was alive.

scrubbles@poptalk.scrubbles.tech on 13 Jun 05:24 next collapse

I tried things like character AI to play with talking to “celebrities”. It was novel, it was fun. For about 15 minutes. Then… Eh. It’s not the person, and your brain knows it’s not them. It’s always an imitation. I got bored talking with people I’ve always wanted to talk to.

I can’t imagine it being a loved one who has passed. It would feel hollow, empty, and wouldn’t make the pain leave. Idk, it just wouldn’t be good at all.

FaceDeer@fedia.io on 13 Jun 03:54 next collapse

I don't believe humans are "meant" to do anything. We are a result of evolution, not intentional design. So I believe humans should do whatever they personally want to do in a situation like this.

If you have a loved one who does this and you don't feel comfortable interacting with their AI version, then don't interact with their AI version. That's on you. But don't belittle them for having preferences different from your own. Different people want different things and deal with death in different ways.

henfredemars@infosec.pub on 13 Jun 11:27 next collapse

Meant, in this context, refers to the conditions that humans have faced over a long period of time and may be more suited to coping with from a survival point of view. I’m an atheist, so I find it strange that you chose to read my comment as highlighting intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been quite a common theme throughout history, and the tools and support available to cope with it and relate it to other human experiences far exceed those for coping with the potential issues that come with AI.

I think one can absolutely speak of needs and adaptation for something as common a human experience as death. If you find something belittling about that opinion, I’m not sure how to address you further. I may simply have to be wrong.

frog@beehaw.org on 13 Jun 13:31 collapse

Just gonna say that I agree with you on this. Humans have evolved over millions of years to emotionally respond to their environment. There’s certainly evidence that many of the mental health problems we see today, particularly at the scale we see, are in part due to the fact that we evolved to live in a very different way to our present lifestyles. And that’s not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness with small social groups, and so on.

We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we’ve evolved to do. Namely, we evolved to grieve for a member of our “tribe”, and then move on. We can’t let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.

AI simulacrums of the deceased give the illusion of maintaining the relationship with the deceased. It is entirely possible that this will prolong the grieving process artificially, when the natural cycle of grieving is to eventually reach a point of acceptance. I don’t know for sure that’s what would happen… but I would want to be absolutely sure it’s not going to cause harm before unleashing this AI on the general public, particularly vulnerable people (which grieving people are).

Although I say that about all AI, so maybe I’m biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.

frog@beehaw.org on 13 Jun 13:17 next collapse

There may not have been any intentional design, but humans are still meant to eat food, drink water, and breathe oxygen, and going against that won’t lead to a good end.

FaceDeer@fedia.io on 13 Jun 21:05 collapse

Even with that, being absolutist about this sort of thing is wrong. People undergoing surgery have spent time on heart/lung machines that breathe for them. People sometimes fast for good reasons, or get IV fluids or nutrients provided to them. You don't see protestors outside of hospitals decrying how humans aren't meant to be kept alive with such things, though, at least not in most cases (as always there are exceptions, the Terri Schiavo case for example).

If I want to create an AI substitute for myself it is not anyone's right to tell me I can't because they don't think I was meant to do that.

frog@beehaw.org on 13 Jun 21:20 collapse

Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, but it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

Zaktor@sopuli.xyz on 13 Jun 23:42 collapse

This is speculation of corporate action completely divorced from the specifics of this technology and particulars of this story. The result of this could be a simple purchase either of hardware or software to be used as chosen by the person owning it. And the person commissioning it can specify exactly who such a simulacrum is presented to. None of this has to be under the power of the company that builds the simulacrums, and if it is structured that way, then that’s the problem that should be rejected or disallowed, not that this particular form of memento exists.

intensely_human@lemm.ee on 14 Jun 17:02 collapse

It could still be a bad idea even if the profit motive isn’t involved.

One might be trying to help with the big surprise stash of heroin they leave to their widow, and she might embrace it fully, but that doesn’t make it a good idea or good for her.

Zaktor@sopuli.xyz on 15 Jun 01:49 collapse

Sure, and that point is being made in multiple other places in these comments. I find it patronizing, but that’s neither here nor there as it’s not what this comment thread is about.

intensely_human@lemm.ee on 14 Jun 16:59 collapse

Recent science agrees sexual selection is a much bigger factor in recent human evolution than natural selection. And sexual selection is conscious.

So, depending on what you consider “design” we have at least been consciously bred for traits by previous generations of humans.

intensely_human@lemm.ee on 14 Jun 16:57 collapse

Yes. Nothing about this idea sounds like a good idea. Honestly I’m kind of pissed at the dude for saddling his wife with this gift.

I_am_10_squirrels@beehaw.org on 13 Jun 03:20 collapse

One of my colleagues has something along the lines of superior autobiographical recall. He remembers in great detail major and minor events from childhood to today. It’s difficult for him to forget.

I myself have forgotten long stretches of my life, and even looking at pictures of myself from those times it feels unfamiliar.

There are some things that I wish I could remember better, but overall I prefer my forgetful brain to his never forget brain.

intensely_human@lemm.ee on 14 Jun 17:05 collapse

I’ve got that biographical detail and it’s kind of weird being able to remember times with my friends that they can’t remember.

Just feels lonely. Like imagine being the only person who can remember more than an hour ago. How your life would feel different than those living within that 1-hour window.

It’s like that just with a different scale.

OsrsNeedsF2P@lemmy.ml on 13 Jun 01:30 next collapse

He posted online, telling his friends it was time to say goodbye. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.

It doesn’t get more tech bro than that

Zaktor@sopuli.xyz on 13 Jun 01:42 collapse

But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they’re both pretty cognizant about what this is and isn’t.

averyminya@beehaw.org on 13 Jun 09:48 next collapse

Yeah, contrary to all the negativity about this in this thread, I think there are a lot of worthwhile reasons for this that aren’t centered on dwelling on the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.

Yes, death and moving on are a part of life, but we also always say to keep people alive in our hearts. I think there are plenty of ways to keep people alive around us without having them present. I don’t think an AI version of someone inherently keeps their spirit from continuing on, nor does it inherently keep your loved one from living in the moment.

Also I can’t help but think of the Star Trek computer but with this. When I was young I had a close gaming friend who we lost too soon, he was very much an announcer personality. He would have been perfect for being my voice assistant, and would have thought it to be hilarious.

Anyway, I definitely see plenty of downsides, don’t get me wrong. The potential for someone to wallow with this is high. But I also think there are quite a few upsides as mentioned – these things aren’t ephemeral, and I think it’s somewhat fair to pick and choose good memories to pass down to remember. Quite a few old philosophical thought experiments are coming to fruition with tech these days.

frog@beehaw.org on 13 Jun 21:24 next collapse

Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

An AI isn’t going to magically know these things, because these aren’t AIs based on brain scans preserving the person’s entire mind and memories. They can learn only the data they’re given. And fortunately, there’s a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.

godzilla_lives@beehaw.org on 13 Jun 21:34 collapse

We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because out of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can’t imagine someone wanting something like that in a sterile technological form like an “AI-powered” app.

“But Trev, what if you used an LLM to generate summaries-” no, fuck off (he said to the hypothetical techbro in his ear).

frog@beehaw.org on 13 Jun 22:14 next collapse

I also suspect, based on the accuracy of AIs we have seen so far, that their interpretation of the deceased’s personality would not be very accurate, and would likely hallucinate memories or facts about the person, or make them “say” things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst would be very, very upsetting for the bereaved person.

godzilla_lives@beehaw.org on 13 Jun 22:50 next collapse

I have no doubts about that either, myself. Though even if such an abomination of a doppelganger were to exist, and it seems that these companies are hellbent on making it so, it would be worse for the reasons you described previously: prolonging and distorting the grieving process that human beings have evolved to go through. All in the name of a dollar. I apologize for being so bitter about this (this bitterness is not directed at you, frog), but this entire “AI” phenomenon disgusts and repulses me so much I want to scream.

frog@beehaw.org on 14 Jun 07:36 collapse

I absolutely, 100% agree with you. Everything I have seen about the development of AI so far suggests that the vast majority of its uses are grotesque. The few edge cases where it is useful and helpful don’t outweigh the massive harm it’s doing.

Zaktor@sopuli.xyz on 13 Jun 23:27 next collapse

This is a very patronizing view of people who all seem to be well informed about what this is and isn’t, and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife, and the husband has preemptively said it’s OK if she or her children never use it.

This might fail in all the ways you think it will. That’s a very small dataset of information, so it’s likely either to be an overcomplicated recording or to need to incorporate training beyond what he personally said, but it’s not your place to tell her what’s best for her personal grieving process.

frog@beehaw.org on 14 Jun 07:30 collapse

Given the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them into the “vulnerable” category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn’t change the fact that there are valid concerns about the exploitation of grief.

With the way AI techbros have been behaving so far, I’m not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a “proof of concept” that can be used to sell this to other vulnerable people.

Zaktor@sopuli.xyz on 14 Jun 09:58 collapse

So just more patronizing. It’s their life, you don’t know better than them how to live it, grief or no.

frog@beehaw.org on 14 Jun 13:49 collapse

Nope, I’m just not giving the benefit of the doubt to the techbro who responded to a dying man’s farewell posts online with “hey, come use my untested AI tool!”

intensely_human@lemm.ee on 14 Jun 17:11 collapse

I think it would be the opposite of upsetting, but in an unhealthy way. I think it would snap them out of their grief into a place of strangeness, and they’d stop feeling their feelings.

There is no cell of my gut that likes this idea.

frog@beehaw.org on 14 Jun 21:02 collapse

Yeah, I think you could be right there, actually. My instinct on this from the start was that it would prevent the grieving process from completing properly. There’s a thing called the gestalt cycle of experience, which describes a normal, natural mechanism for a person going through a new experience, whether it’s good or bad. A lot of unhealthy behaviour patterns stem from a part of that cycle being interrupted: you need to go through the cycle for everything that happens in your life, reaching closure so that you’re ready for the next experience to begin (most basic explanation), and when that doesn’t happen properly, it creates unhealthy patterns that influence everything that happens after that.

Now I suppose, theoretically, there’s a possibility that being able to talk to an AI replication of a loved one might give someone a chance to say things they couldn’t say before the person died, which could aid in gaining closure… but we already have methods for doing that, like talking to a photo of them or to their grave, or writing them a letter, etc. Because the AI still creates the sense of the person still being “there”, it seems more likely to prevent closure - because that concrete ending is blurred.

Also, your username seems really fitting for this conversation. :)

averyminya@beehaw.org on 14 Jun 06:36 collapse

I more meant in the case of someone whose life was cut short and didn’t have the time to put something like this together. I agree that ideally this is information you’d get to pass down, but life doesn’t always work out like that.

Also like you said about the AI powered app, it’s only a matter of time before Adobe Historical Life comes out and we’re paying $90 a month for gramma’s recipes (stories are an additional subscription).

intensely_human@lemm.ee on 14 Jun 17:10 collapse

I went back and read old emails from my mother who died in 2009. I had unread emails from her.

One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.

It might have been nice if an LLM had found that instead of me, but it felt very amazing to discover it myself.

Zaktor@sopuli.xyz on 14 Jun 00:07 next collapse

This is a weirdly “you should only do things the natural way” comment section for a Tech-based community.

Humans also weren’t “meant” to be on social media, or recording videos of themselves, or even building shrines or gravesites for their loved ones. They’re just practices that have sprung up as technology and culture change. This very well could be an impediment to her moving on without him, but that’s her choice to make, and all this appeal to tradition is patronizing and doesn’t actually mean tradition is the right path for any given individual. The only right way to process death is:

  • Burn their body and possessions so that no trace remains
  • Pump their body full of chemicals so they won’t be decomposing when people ceremonially visit their corpse weeks later
  • Entomb them with their cats, slaves, and riches
  • Plant a tree nourished by their decomposing corpse
  • Turn their ashes into a piece of jewelry to be carried with you always
  • Make a shrine to the dead in your home to be prayed at regularly
  • Cast a death mask to more accurately sculpt their bust
  • Freeze their head so they may be resurrected later
intensely_human@lemm.ee on 14 Jun 17:08 collapse

Think of how many family recipes could be preserved

We solved this problem long before we invented writing.

LLMs do not enable the keeping of family memories. That’s been going on a long time.

intensely_human@lemm.ee on 14 Jun 17:06 collapse

and they’re both pretty cognizant about what this is and isn’t

This will be communicating with a dead person. Nobody has any idea what this is and what it isn’t.

It’s like planning to go to Morocco and thinking you know in advance what it’s gonna be like.

This is new technology. People who think they know the outcomes here are deluding themselves.

JoMiran@lemmy.ml on 13 Jun 03:12 next collapse

It made me think of this old Michael Keaton movie, “My Life”, in which he leaves a treasure trove of video tapes to his unborn child.

jlow@beehaw.org on 13 Jun 03:38 next collapse

en.m.wikipedia.org/wiki/Be_Right_Back

Black Mirror is not an instruction manual, people. Quite the opposite. Can we stop trying to make every episode real?

FaceDeer@fedia.io on 13 Jun 03:50 next collapse

If you don't want to do it then don't do it. Can we stop trying to tell everyone else they have to have the same values as you?

Kolanaki@yiffit.net on 13 Jun 05:28 next collapse

Maybe they were inspired by Mulholland Drive instead.

stick2urgunz88@lemm.ee on 13 Jun 19:22 next collapse

This was my first thought. How about we not try to recreate the dystopian fictional TV show.

intensely_human@lemm.ee on 14 Jun 16:55 collapse

We’re not “trying to make every episode real”. Technology’s direction and human foibles are predictable. Black Mirror writers just aren’t blind and have a good sense of what’s coming down the pipeline.

That’s why it’s called Mirror. It’s about showing us who we are.

Sorry if that’s too horrifying for you, but this goes way beyond imitating the last person to mention these problems.

muse@fedia.io on 13 Jun 01:57 next collapse

"I'll never forget the sweet romantic words he said to me last night: 'As a learning language model, I am unable to comprehend what the feeling love is. Here is a list of love songs from Wikipedia.'"

[deleted] on 13 Jun 04:31 next collapse

.

podperson@lemm.ee on 13 Jun 20:18 collapse

Hi honey, here’s Despacito…

Psych@lemmy.sdf.org on 13 Jun 06:45 next collapse

Guy going full-on Pantheon.

dubyakay@lemmy.ca on 14 Jun 02:27 collapse

Ugh the brain “scan” though. I think that’s bs. At least in the show, not sure about the short story.

padlock4995@lemmy.ml on 13 Jun 20:39 next collapse

Transcendence and Black Mirror both ended really well. Keep at it. T200’s next.

Marin_Rider@aussie.zone on 14 Jun 02:40 next collapse

literally the plot of Caprica

Cethin@lemmy.zip on 14 Jun 07:02 collapse

Sadly, no one knows the plot of Caprica because we’re the only two people in the world who watched it. It’s impressive how well BSG was received and is remembered and most people don’t even know Caprica exists.

Marin_Rider@aussie.zone on 14 Jun 07:15 collapse

It is a shame. I wish they had been able to make another season, but I guess just for the two of us it wouldn’t have made much sense!

shasta@lemm.ee on 14 Jun 21:45 next collapse

Yeah that seems healthy

Simulation6@sopuli.xyz on 15 Jun 00:47 collapse

Why not make an AI that can look for a cure to his cancer?