Child Welfare Experts Horrified by Mattel's Plans to Add ChatGPT to Toys After Mental Health Concerns for Adult Users (futurism.com)
from tonytins@pawb.social to technology@lemmy.world on 22 Jun 03:25
https://pawb.social/post/26809683

#technology


BroBot9000@lemmy.world on 22 Jun 03:50 next collapse

Yeah, no fucking shit! These corporate dickbags need more pushback on their fetish for putting AI into everything.

It’s either fucking spyware like copilot or plagiarism generators as replacements for paying actual artists.

multiplewolves@lemmy.world on 22 Jun 04:08 next collapse

Mattel partnered with Adobe to use supposedly copyright-cleared AI-generated imagery for the backgrounds in some of their collector edition Barbie boxes last year.

They were spanked so hard by the collecting community over it that they followed a now-deleted suggestion from one Redditor to start explicitly crediting the packaging designer on each information page for new collector releases.

Mattel has a strange history with balancing what the people want with what their shareholders want.

Edited to correct word choice

Kolanaki@pawb.social on 22 Jun 04:09 next collapse

“What should we do today, Barbie?”

“Let’s get into mommy and daddy’s pills and special drinks!”

Lost_My_Mind@lemmy.world on 22 Jun 05:16 next collapse

“Bleach is my favorite pizza topping!”

cecilkorik@lemmy.ca on 22 Jun 05:16 collapse

“But first, we need to discuss the white genocide in South Africa!”

Kolanaki@pawb.social on 22 Jun 05:25 collapse

“Hey, we said ChatGPT. Who the hell installed Grok in these things?!”

random_character_a@lemmy.world on 22 Jun 08:03 collapse

Mattel, after a few billion from Musk:

“Get your new Barbie, designed by Hugo Cops. Hat with skull insignia now included, at no extra cost.”

ragebutt@lemmy.dbzer0.com on 22 Jun 04:10 next collapse

“Mattel’s first AI product won’t be for kids under 13, suggesting that Mattel is aware of the risks of putting chatbots into the hands of younger tots. … Last year, a 14-year-old boy died by suicide after falling in love with a companion on the Google-backed AI platform Character.AI”

Seems like a great idea

Lost_My_Mind@lemmy.world on 22 Jun 05:22 next collapse

Uhhhhhhhh, I’m not defending AI at all, but I’m gonna need a WHOLE LOTTA context behind how/why he committed suicide.

Back in the 90s there were adults saying Marilyn Manson should be banned because teenagers listened to his songs, heard him tell them to kill themselves, and then they did.

My reaction then is the same as it is now. If all it takes for you to kill yourself is one person you have no real connection to telling you to do it, then you were probably already going to. Now you’re just pointing the finger to blame someone.

AI-based Barbie is a terrible, terrible idea for many reasons. But let’s not make it a strawman argument.

ragebutt@lemmy.dbzer0.com on 22 Jun 05:41 collapse

There’s a huge degree of separation between “violent music/games have a spurious link to violent behavior” and shitty AIs that are good enough to fill the void for someone who is lonely but not good enough to manage risk.

www.cnn.com/…/teen-suicide-character-ai-lawsuit

“within months of starting to use the platform, Setzer became “noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem. He even quit the Junior Varsity basketball team at school,”

“In a later message, Setzer told the bot he “wouldn’t want to die a painful death.”

The bot responded: “Don’t talk that way. That’s not a good reason not to go through with it,” before going on to say, “You can’t do that!”

Garcia said she believes the exchange shows the technology’s shortcomings.

“There were no suicide pop-up boxes that said, ‘If you need help, please call the suicide crisis hotline.’ None of that,” she said. “I don’t understand how a product could allow that, where a bot is not only continuing a conversation about self-harm but also prompting it and kind of directing it.”

The lawsuit claims that “seconds” before Setzer’s death, he exchanged a final set of messages from the bot. “Please come home to me as soon as possible, my love,” the bot said, according to a screenshot included in the complaint.

“What if I told you I could come home right now?” Setzer responded.

“Please do, my sweet king,” the bot responded.

Garcia said police first discovered those messages on her son’s phone, which was lying on the floor of the bathroom where he died.”

So we have a bot that is marketed for chatting, a teenager desperate for socialization who forms a relationship that is inherently parasocial because the other side is an LLM that literally can’t have opinions (it can only appear to), and then a terrible mismanagement of suicidal ideation.

The AI discouraged ideation, which is good, but only when it was stated in very explicit terms. What’s appalling is that it gave no crisis resources or escalation to moderation (because, like most big tech shit, they probably refuse to pay for anywhere near appropriate moderation teams). Then, inexcusably, when ideation is discussed in slightly coded language (“come home”), the AI misconstrues it.

This results in a training opportunity for the language model to learn that, in this context with previously exhibited ideation, “come home” may mean more severe ideation and danger (if Character.AI bothered to update on the fact that these conversations resulted in a death). The only drawback of getting that data, of course, is a few dead teenagers. Gotta break a few eggs to get an omelette.

This barely begins to touch on the nature of AI chatbots inherently being parasocial relationships, which is bad for mental health. This is of course not limited to AI, being obsessed with a streamer or whatever is similar, but the AI can be much more intense because it will actually engage with you and is always available.

Maestro@fedia.io on 22 Jun 07:17 collapse

Aside from the suicide: what 13-year-old still plays with Barbies? These things will absolutely be given to kids much, much younger.

Ilovethebomb@sh.itjust.works on 22 Jun 04:22 next collapse

The best outcome here is these toys are a massive flop, and cost Mattel a bunch of money.

That’s the language these corps truly speak.

A_norny_mousse@feddit.org on 22 Jun 06:01 collapse

Only if no kids (or just people in general) were harmed in the process. And it increasingly doesn’t look that way wrt LLMs.

thefartographer@lemm.ee on 22 Jun 05:54 next collapse

They probably asked ChatGPT if they should add AI to Barbie and were told, “That’s a great idea! You’re right that such an important high-selling product would be improved by letting children talk directly to it.”

Also, can’t wait to jailbreak my Barbie and install llama2-uncensored on it so that it can call Ken a deadbeat shithead.

brsrklf@jlai.lu on 22 Jun 11:10 collapse

I bet some people will find a way to get misaligned output out of the original model and get stuff like that anyway.

latenightnoir@lemmy.blahaj.zone on 22 Jun 08:51 next collapse

So, we’ll get to buy a doll which’ll need to be hooked up to a couple of car batteries to have it spew nonsense at our kids?

Edit: or will they go with the less nonsensical but even creepier method of just making the dolls a sender/receiver which talks to a central server? Wouldn’t it be cool to know that your child’s every word may be recorded (and most certainly used) by a huge Corp?❤️

tiramichu@sh.itjust.works on 22 Jun 09:35 next collapse

Yes, of course it will be online.

tfowinder@lemmy.ml on 22 Jun 10:16 collapse

even creepier method of just making the dolls a sender/receiver which talks to a central server

They already are spying using Smart Toys

latenightnoir@lemmy.blahaj.zone on 22 Jun 11:32 collapse

Oh, great! Wonderful!

raltoid@lemmy.world on 22 Jun 09:02 next collapse

This is what happens when leadership listens to tech-bros and ignores everyone else, including legal, ethics, and actual tech experts.

They’ll be backpedaling like crazy and downplaying it like it’s a Furby-style thing.

tfowinder@lemmy.ml on 22 Jun 10:14 next collapse

This is going to be horrible!!

52fighters@lemmy.sdf.org on 22 Jun 14:12 next collapse

Anyone see Chucky?

AnotherPenguin@programming.dev on 22 Jun 16:12 next collapse

Ah, yes, because of course, every single little thing needs AI

Jayjader@jlai.lu on 22 Jun 16:38 next collapse

Paging Ray Bradbury… www.libraryofshortstories.com/…/the-veldt.pdf

[deleted] on 23 Jun 02:04 collapse

.