Disturbing AI Images of Children Found for Sale on Shutterstock (petapixel.com)
from stopthatgirl7@kbin.social to technology@lemmy.world on 23 Feb 2024 02:59
https://kbin.social/m/technology@lemmy.world/t/853805

Someone used Shutterstock's AI image generator to create them.

#technology

db2@lemmy.world on 23 Feb 2024 03:04

I’m curious what they call disturbing, but I also don’t want to see it in case they’re right.

ChihuahuaOfDoom@lemmy.world on 23 Feb 2024 03:13

The article doesn’t mention nudity but what they described is still pretty fucked.

Just_Pizza_Crust@lemmy.world on 23 Feb 2024 03:45

I think it really depends on what “young girl” means in this context. The title says “children”, but nowhere in the article does it say that. So I’m unsure if this is another AI-boogeyman article, or something else.

stopthatgirl7@kbin.social on 23 Feb 2024 06:18

A “young girl” would be a “child,” and multiple young girls would be children. 🤨

douglasg14b@lemmy.world on 23 Feb 2024 08:56

There’s a link below of a “young girl” on the toilet.

It appears to be a young adult, clothed, using a toilet as a seat. Idk why it’s labeled the way it is, it’s really weird.

However, that somewhat dilutes the notion that “young girl” means children on this site.

Just_Pizza_Crust@lemmy.world on 23 Feb 2024 12:44

That’s pure speculation on your part.

Like another person said, the “young girl” on the toilet looks to be a woman well into her 20s.

andrew@lemmy.stuart.fun on 23 Feb 2024 03:19

A note on the page warns, “Shutterstock does not review AI-generated content for compliance with Shutterstock’s content compliance standards,” adding that users must not generate imagery that is “false, misleading, deceptive, harmful, or violent.”

“Pls don’t be bad mmkay?”

“We’ve done all we possibly can.”

Zak@lemmy.world on 23 Feb 2024 03:30

This may be controversial, but I don’t care what kind of AI-generated images people create as long as it’s obvious they’re not reality. Where I worry is the creation of believable false narratives, from explicit deepfakes of real people to completely fictional newsworthy events.

StunningGoggles@sh.itjust.works on 23 Feb 2024 18:01

I’ve read that pedophiles are more likely to act on their urges if they have access to real images. I would guess this also applies to AI-generated images, even if they don’t look 100% real, but I could be wrong on that. Whatever stops them from abusing kids is what I’m for.

Zak@lemmy.world on 23 Feb 2024 19:16

I want to say research on the subject has been inconclusive overall. I’d certainly update my view given convincing evidence that fictional images lead to abuse of real children.

Of course, none of that has anything to do with the non-explicit video linked elsewhere in this thread of an adult woman using the toilet.

HubertManne@kbin.social on 24 Feb 2024 02:52

I agree here. I’m not worried about imaginary things, except for their ability to look like actual things and mess with the truth.

ABCDE@lemmy.world on 23 Feb 2024 03:43

There’s some pretty weird stuff on there, like kids taking a bath and someone on the toilet: shutterstock.com/…/clip-26807341-woman-sitting-on…

tourist@lemmy.world on 23 Feb 2024 08:07

[image: https://lemmy.world/pictrs/image/7aa7d20a-9d7f-429f-9511-763cde48c2f2.jpeg]

Irinir@lemmy.world on 23 Feb 2024 08:47

Nope nope nope. Not even risking it.

just_another_person@lemmy.world on 23 Feb 2024 05:53

Called it

GBU_28@lemm.ee on 23 Feb 2024 06:17

In before someone defends pedophiles, oh wait

Scubus@sh.itjust.works on 23 Feb 2024 08:19

Please, the Catholic church are heroes

Edit: wow, didn’t think I needed the /s since I was directly linking the Catholic church with pedophilia

kudu@lemmy.world on 23 Feb 2024 08:33

Why the fuck do these AIs know how to generate this shit of children 😕

photonic_sorcerer@lemmy.dbzer0.com on 23 Feb 2024 11:23

These image-generating AIs don’t need to have been trained on exactly what you want them to output. If you tell one to generate a banana car, it doesn’t need to have seen a real banana car before; it just knows what a banana and a car look like, and combines them. Similarly, such AIs would know what naked humans look like and what kids look like.
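To make the banana-car point concrete, here’s a minimal sketch of that kind of concept composition using the open-source Stable Diffusion weights via the Hugging Face diffusers library (an assumption for illustration; Shutterstock’s own generator isn’t public, and this is not its implementation):

```python
# Sketch only: assumes the public runwayml/stable-diffusion-v1-5 weights and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# The model has almost certainly never seen a real banana car, but it knows
# "banana" and "car" separately and composes them into a single image.
image = pipe("a car shaped like a banana, studio photo").images[0]
image.save("banana_car.png")
```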

kudu@lemmy.world on 23 Feb 2024 14:33

I see, thanks for the explanation