AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action (apnews.com)
from robocall@lemmy.world to technology@lemmy.world on 25 Oct 2023 21:31
https://lemmy.world/post/7331591

#technology


autotldr@lemmings.world on 25 Oct 2023 21:35 next collapse

This is the best summary I could come up with:


NEW YORK (AP) — The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday.

In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.

In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country’s southeast.

What IWF analysts found were abusers sharing tips and marveling about how easy it was to turn their home computers into factories for generating sexually explicit images of children of all ages.

While the IWF’s report is meant to flag a growing problem more than offer prescriptions, it urges governments to strengthen laws to make it easier to combat AI-generated abuse.

Users can still access unfiltered older versions of Stable Diffusion, however, which are “overwhelmingly the software of choice … for people creating explicit content involving children,” said David Thiel, chief technologist of the Stanford Internet Observatory, another watchdog group studying the problem.


The original article contains 1,013 words, the summary contains 223 words. Saved 78%. I’m a bot and I’m open source!

BombOmOm@lemmy.world on 25 Oct 2023 21:37 next collapse

Drawn art depicting minors in sexual situations has been deemed protected as free speech in the US. It’s why, at least in the US, you don’t have to worry about the anime girl that’s 17 landing you in prison on child porn charges. The reasoning: there is no victim, the anime girl is not sentient, therefore the creation of that art is protected as free speech.

I suspect a similar thing will happen with this. As long as it is not depicting a real person, the completely invented person is not sentient, there is no victim, this will fall under free speech. At least in the US.

However, it is likely a very, very bad idea to have any photo-realistic art of this manner, as it may not be clear to authorities if it is from AI or if there is in fact a person you are victimizing. Doubly so if you download this from someone else, as you don’t know if that is a real person either.

fubo@lemmy.world on 25 Oct 2023 21:55 next collapse

Deepfakes of an actual child should be considered defamatory use of a person’s image; but they aren’t evidence of actual abuse the way real CSAM is.

Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

Pyro@pawb.social on 25 Oct 2023 22:24 next collapse

Add in an extra twist: hopefully, if the sickos are at least happy with AI stuff, they won’t need the “real” thing.

Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

fubo@lemmy.world on 25 Oct 2023 23:10 next collapse

Some actually fetishize causing suffering.

JohnEdwa@sopuli.xyz on 26 Oct 2023 12:14 collapse

Some people are sadists and rapists, yes, regardless of what age group they’d want to do it with.

hoshikarakitaridia@sh.itjust.works on 26 Oct 2023 05:02 next collapse

Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

This is the part where I disagree, and I would love for people to prove me wrong. Because whether this is true or false will probably be the deciding factor in allowing or restricting “artificial CSAM”.

topinambour_rex@lemmy.world on 26 Oct 2023 12:20 collapse

Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

Have you got some source about this ?

Uranium3006@kbin.social on 25 Oct 2023 23:41 collapse

Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

fubo@lemmy.world on 25 Oct 2023 23:45 collapse

As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of “protecting children”, yes.

Uranium3006@kbin.social on 26 Oct 2023 01:01 collapse

And it's usually fascists, or at least people who may not consider themselves as such but think and act like fascists anyways.

HubertManne@kbin.social on 25 Oct 2023 22:20 next collapse

This is sort of a problem in regular porn too. I’m not sure if the acting has improved, but sometimes I’m turned off because I’m not sure the acts weren’t in some way coerced. Especially given some of the recent stuff with modeling operations where they take their passports and shit.

gregorum@lemm.ee on 25 Oct 2023 23:23 collapse

Yeah, I get turned off by porn if the actors don’t seem all that into it. “Possibly coerced” sets off alarms, although I hardly ever run across that.

BrianTheeBiscuiteer@lemmy.world on 25 Oct 2023 23:45 next collapse

Possession of CSAM that’s 20 years old (i.e. the subject is now adult) or even 100 years old (i.e. the subject is likely deceased) is not legal. You don’t have to pay for it or create it, just possess it. Yeah, they’ll find a way to prosecute for images of non-existent children.

CptBread@lemmy.world on 26 Oct 2023 00:21 collapse

Anything that looks realistic should be illegal, if you ask me, as otherwise it would become harder to prosecute real child porn. “Oh, that picture was just modified with AI” could be hard to disprove…

BreakDecks@lemmy.ml on 26 Oct 2023 00:27 next collapse

We shouldn’t be prosecuting people just because they have things that look like child porn; we have to prove that there’s a victim, or people get accused of things they didn’t do.

midsouthcriminaldefense.com/…/porn-star-appears-i…

logicbomb@lemmy.world on 26 Oct 2023 01:23 collapse

We shouldn’t be prosecuting these people, but we should be figuring out how to get them help.

An adult person who is attracted to children can obviously not have any legal sexual contact with a child, just like anybody else, and so we need to make sure they have the tools and ability to get by without that.

I don’t know what’s best for these people. Maybe the best way to help them is to let them have this fake material. Maybe the best way to help them is to try to deny them this sort of material. There’s probably some scientist out there who has studied what is the best thing.

Doomsider@lemmy.world on 26 Oct 2023 03:40 collapse

Allowing someone to act out on their deranged fantasies just results in reinforcing this behavior. No, it would not help them.

We learned in the early eighties that letting people scream, tear things up, and generally destroy stuff did not help them move past their feelings of anger. If you hit things to deal with anger, it becomes a feedback loop: hitting more things, more often, to deal with the emotion.

Supermariofan67@programming.dev on 26 Oct 2023 02:31 collapse

Exactly that has already been tried and struck down by the Supreme Court in Ashcroft v. Free Speech Coalition. It turns out porn of people over 18 very often looks the same as porn of people under 18, so such a law would ban a considerable amount of legal adult content.

systemglitch@lemmy.world on 25 Oct 2023 22:06 next collapse

Impossible to stop. Good luck with AI in the future fellow humans.

Nurse_Robot@lemmy.world on 26 Oct 2023 00:11 next collapse

I think the biggest worry for me at this point is what the AI trained on in order to depict these images. It’s not victimless if it needs victims of child abuse to train on

Edit: Really fucking weird that I’m getting downvoted for being against AI training on child porn. I’m willing to go down with that ship.

Chozo@kbin.social on 26 Oct 2023 00:41 next collapse

It knows what naked people look like, and it knows what children look like. It doesn't need naked children to fill in those gaps.

Also, these models are trained with images scraped from the clear net. Somebody would have had to manually add CSAM to the training data, which would be easily traced back to them if they did. The likelihood of actual CSAM being included in any mainstream AI’s training material is slim to none.

Nurse_Robot@lemmy.world on 26 Oct 2023 02:32 next collapse

Defending AI generated child porn is a weird take, and the support you’re receiving is even more concerning

Chozo@kbin.social on 26 Oct 2023 08:24 collapse

I'm not defending it, dipshit. I'm explaining how generative AI training works.

The fact that you can't see that is what's really concerning.

BetaDoggo_@lemmy.world on 26 Oct 2023 02:55 collapse

There is likely some CSAM in most of these models, as filtering it out of a several-billion-image set is nearly impossible, even with automated methods. This material likely has little to no effect on outputs, however, since it’s probably scarce and was likely tagged incorrectly.

The bigger concern is users downstream fine-tuning models on their own datasets containing this material. This has been happening for a while, though I won’t point fingers (Japan).

There’s not a whole lot that can be done about it, but I also don’t think there’s anything that needs to be done. It’s already illegal, and it’s already removed from most platforms semi-automatically. Having more of it won’t change that.

Thorny_Insight@lemm.ee on 26 Oct 2023 06:00 collapse

AI can generate a picture of an astronaut riding a horse on the moon. It wasn’t trained on pictures of astronauts riding horses on the moon, though.

really fucking weird I’m getting down voted for being against AI training on child porn

Because you made that up. It’s not happening.

BarrierWithAshes@kbin.social on 26 Oct 2023 00:26 collapse

Same thing is gonna happen (if it’s not happening already) with animal abuse videos and images. The silver lining is that at least no actual animals are getting hurt, but still. Grim.