this post was submitted on 13 Jun 2024
Company he works at: eternos.life

[–] Zaktor@sopuli.xyz 21 points 5 months ago (2 children)

But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they're both pretty cognizant of what this is and isn't.

[–] averyminya@beehaw.org 6 points 5 months ago (3 children)

Yeah, contrary to all the negativity about this in this thread, I think there are a lot of worthwhile reasons for this that aren't centered on fawning over the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you'd easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.

Yes, death and moving on are a part of life, but we also always say to keep people alive in our hearts. I think there are plenty of ways to keep people alive for us without having them present. I don't think an AI version of someone inherently keeps their spirit from continuing on, nor does it inherently keep your loved one from living in the moment.

Also, I can't help but think of the Star Trek computer with this. When I was young I had a close gaming friend whom we lost too soon; he was very much an announcer personality. He would have been perfect as my voice assistant, and would have thought it was hilarious.

Anyway, I definitely see plenty of downsides, don't get me wrong. The potential for someone to wallow with this is high. I also think there's quite a few upsides as mentioned -- they aren't ephemeral, but I think it's somewhat fair to pick and choose good memories to pass down to remember. Quite a few old philosophical advents coming to fruition with tech these days.

[–] frog@beehaw.org 11 points 5 months ago (1 children)

Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

An AI isn't going to magically know these things, because these aren't AIs based on brain scans preserving the person's entire mind and memories. They can only learn from the data they're given. And fortunately, there's a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.

[–] godzilla_lives@beehaw.org 10 points 5 months ago (2 children)

We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, out of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can't imagine someone wanting something like that in a sterile technological form like an "AI-powered" app.

"But Trev, what if you used an LLM to generate summaries-" no, fuck off (he said to the hypothetical techbro in his ear).

[–] frog@beehaw.org 8 points 5 months ago (3 children)

I also suspect, based on the accuracy of AIs we have seen so far, that their interpretation of the deceased's personality would not be very accurate, and would likely hallucinate memories or facts about the person, or make them "say" things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst would be very, very upsetting for the bereaved person.

[–] Zaktor@sopuli.xyz 3 points 5 months ago (1 children)

This is a very patronizing view of people who all seem to be well informed about what this is and isn't, and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife, and the husband has preemptively said it's okay if she or her children never use it.

This might fail in all the ways you think it will. That's a very small dataset of information, so it's likely to either be an overcomplicated recording or to need training data beyond what he personally said, but it's not your place to tell her what's best for her personal grieving process.

[–] frog@beehaw.org 4 points 5 months ago* (last edited 5 months ago) (1 children)

Given that the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them in the "vulnerable" category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn't change the fact that there are valid concerns about the exploitation of grief.

With the way AI techbros have been behaving so far, I'm not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a "proof of concept" that can be used to sell this to other vulnerable people.

[–] Zaktor@sopuli.xyz 1 points 5 months ago* (last edited 5 months ago) (1 children)

So just more patronizing. It's their life, you don't know better than them how to live it, grief or no.

[–] frog@beehaw.org 2 points 5 months ago

Nope, I'm just not giving the benefit of the doubt to the techbro who responded to a dying man's farewell posts online with "hey, come use my untested AI tool!"

[–] godzilla_lives@beehaw.org 2 points 5 months ago (1 children)

I have no doubts about that either, myself. Though even if such an abomination of a doppelganger were to exist, and it seems that these companies are hellbent on making it so, it would be worse for the reasons you described previously: prolonging and molesting the grieving process that human beings have evolved to go through. All in the name of a dollar. I apologize for being so bitter about this (this bitterness is not directed at you, frog), but this entire "AI' phenomenon fucking disgusts and repulses me so much I want to scream.

[–] frog@beehaw.org 2 points 5 months ago

I absolutely, 100% agree with you. Nothing I have seen about the development of AI so far has suggested otherwise: the vast majority of its uses are grotesque. The few edge cases where it is useful and helpful don't outweigh the massive harm it's doing.

[–] intensely_human@lemm.ee 1 points 5 months ago (1 children)

I think it would be the opposite of upsetting, but in an unhealthy way. I think it would snap them out of their grief into a place of strangeness, and they'd stop feeling their feelings.

There is no cell of my gut that likes this idea.

[–] frog@beehaw.org 1 points 5 months ago

Yeah, I think you could be right there, actually. My instinct on this from the start has been that it would prevent the grieving process from completing properly. There's a thing called the gestalt cycle of experience, a normal, natural mechanism for a person going through a new experience, whether good or bad. A lot of unhealthy behaviour patterns stem from a part of that cycle being interrupted: you need to go through the cycle for everything that happens in your life, reaching closure so that you're ready for the next experience to begin (most basic explanation), and when that doesn't happen properly, it creates unhealthy patterns that influence everything that happens after.

Now I suppose, theoretically, there's a possibility that being able to talk to an AI replication of a loved one might give someone a chance to say things they couldn't say before the person died, which could aid in gaining closure... but we already have methods for doing that, like talking to a photo of them or to their grave, or writing them a letter, etc. Because the AI still creates the sense of the person still being "there", it seems more likely to prevent closure - because that concrete ending is blurred.

Also, your username seems really fitting for this conversation. :)

[–] averyminya@beehaw.org 2 points 5 months ago (1 children)

I meant more in the case of someone whose life was cut short and who didn't have the time to put something like this together. I agree that ideally this is information you'd get to pass down, but life doesn't always work out like that.

Also like you said about the AI powered app, it's only a matter of time before Adobe Historical Life comes out and we're paying $90 a month for gramma's recipes (stories are an additional subscription).

[–] intensely_human@lemm.ee 3 points 5 months ago

I went back and read old emails from my mother who died in 2009. I had unread emails from her.

One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.

It might have been nice if an LLM had found that instead of me, but it felt very amazing to discover it myself.

[–] intensely_human@lemm.ee 4 points 5 months ago

Think of how many family recipes could be preserved

We solved this problem long before we invented writing.

LLMs do not enable the keeping of family memories. That’s been going on a long time.

[–] Zaktor@sopuli.xyz 2 points 5 months ago

This is a weirdly "you should only do things the natural way" comment section for a Tech-based community.

Humans also weren't "meant" to be on social media, or recording videos of themselves, or even building shrines or gravesites for their loved ones. They're just practices that have sprung up as technology and culture change. This very well could be an impediment to her moving on without him, but that's her choice to make, and all this appeal to tradition is patronizing and doesn't actually mean tradition is the right path for any given individual. The only right way to process death is:

  • Burn their body and possessions so that no trace remains
  • Pump their body full of chemicals so they won't be decomposing when people ceremonially visit their corpse weeks later
  • Entomb them with their cats, slaves, and riches
  • Plant a tree nourished by their decomposing corpse
  • Turn their ashes into a piece of jewelry to be carried with you always
  • Make a shrine to the dead in your home to be prayed at regularly
  • Cast a death mask to more accurately sculpt their bust
  • Freeze their head so they may be resurrected later
[–] intensely_human@lemm.ee 1 points 5 months ago

and they're both pretty cognizant about what this is and isn't

This will be communicating with a dead person. Nobody has any idea what this is and what it isn't.

It’s like planning to go to Morocco and thinking you know in advance what it’s gonna be like.

This is new technology. People who think they know the outcomes here are deluding themselves.