Girl, 15, speaks out after classmate made deepfake nudes of her and posted online (www.independent.co.uk)
from dvdnet62@feddit.nl to technology@lemmy.ml on 23 Jun 2024 00:52
https://feddit.nl/post/17052139

#technology

threaded - newest

autotldr@lemmings.world on 23 Jun 2024 00:55 next collapse

This is the best summary I could come up with:


The teen’s phone was flooded with calls and texts telling her that someone had shared fake nude images of her on Snapchat and other social media platforms.

Berry, now 15, is calling on lawmakers to write criminal penalties into law for perpetrators to protect future victims of deepfake images.

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

The mom and daughter say legislation is essential to protecting future victims, and could have meant more serious consequences for the classmate who shared the deep-fakes.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said.


The original article contains 585 words, the summary contains 205 words. Saved 65%. I’m a bot and I’m open source!

[deleted] on 23 Jun 2024 02:08 collapse

.

Snowclone@lemmy.world on 23 Jun 2024 01:12 next collapse

That's all well and good to remove them, but it solves nothing. At this point every easily accessible AI I'm aware of is kicking back any prompts with the names of real-life people; they're already anticipating real laws. Preventing the images from being made in the first place isn't impossible.

tailiat@lemmy.ml on 23 Jun 2024 01:35 next collapse

How?

Snowclone@lemmy.world on 23 Jun 2024 02:05 collapse

If you really must, you can simply have the AI auto-delete NSFW images; several already do this. And if the argument is that you can't simply refuse to ever generate or give out NSFW images, you can also gate NSFW content generation behind any number of hindrances that are highly effective against anonymous use or underage use.

i_am_not_a_robot@discuss.tchncs.de on 23 Jun 2024 02:29 collapse

Modern AI is not capable of this. The accuracy for detecting nsfw content is not good, and they are completely incapable of detecting when nsfw content is allowable because they have no morals and they don’t understand anything about people or situations besides appearance.

jacksilver@lemmy.world on 23 Jun 2024 03:39 collapse

These models are also already open sourced, you can’t stop it. Additionally all of those “requirements” would just mean that AI would only be owned by big corporations.

reev@sh.itjust.works on 23 Jun 2024 07:15 collapse

I think that’s one thing that’s often forgotten in these conversations. The cat’s out of the bag, you will never be able to stop the generation of things as they are right now. You’ll just be able to punish people for doing so.

scrubbles@poptalk.scrubbles.tech on 23 Jun 2024 01:38 next collapse

Agreed. To me, making them is one thing, it’s like making a drawing at home. Is it moral? Not really. Should it be illegal? I don’t think so.

Now, what this kid did, distributing them? Absolutely not okay. At that point it's not private, and you could hurt their reputation.

This of course ignores the whole fact that she’s underage, which is on its own wrong. AI generated csam is still csam.

bane_killgrind@slrpnk.net on 23 Jun 2024 01:57 next collapse

Reputation matters less than harassment. If these people were describing her body publicly it would be a similar attack.

DashboTreeFrog@discuss.online on 23 Jun 2024 02:06 next collapse

A friend in high school made nude drawings of another mutual friend. It was weird that he showed me, but he was generally an artsy guy, I knew he was REALLY into this girl, and it was kind of in the context of showing me his artwork. I reconnected with the girl years later and talked about this, and while she said it was weird, she didn't really think much of it. Rather, the creepy part to her was that he showed people.

I don’t think we can stop horny teens from making horny content about their classmates, heck, I know multiple girls who wrote erotic stories featuring classmates. The sharing (and realism) is what turns the creepy but kind of understandable teenage behavior into something we need to deal with

jacksilver@lemmy.world on 23 Jun 2024 03:37 collapse

All I’m hearing is jailtime for Tina Belcher and her erotic friend fiction!

But seriously, i generally agree that as long as people aren't sharing it, it shouldn't be a problem. If I can picture it in my head without consequence, it seems kinda silly that putting that thought on paper/screen should be illegal.

scrubbles@poptalk.scrubbles.tech on 23 Jun 2024 03:56 collapse

Exactly, and it raises the question too: where's the line? If you draw a stick figure of your crush with boobs, is that a crime? Is it when you put an arrow and write her name next to it? AI just makes that more realistic, but it's the same basic premise.

Distributing it is where it crosses a hard line and becomes something that should not be encouraged.

retrospectology@lemmy.world on 23 Jun 2024 10:28 collapse

It's not some slippery slope to prohibit people from generating sexual imagery of real people without their consent. The fuck is wrong with AI supporters?

Even if you’re a “horny teenager” making fake porn of someone is fucking weird and not normal or healthy.

Fubarberry@sopuli.xyz on 23 Jun 2024 02:50 next collapse

AI generated csam is still csam.

Idk, with real people the determination of whether someone is underage is based on their age and not their physical appearance. There are people who look unnaturally young who could legally do porn, and underage people who look much older but aren't allowed. It's not about their appearance, but how old they are.

With drawn or AI-generated CSAM, how would you draw that line of what’s fine and what’s a major crime with lifelong repercussions? There’s not an actual age to use, the images aren’t real, so how do you determine the legal age? Do you do a physical developmental point scale and pick a value that’s developed enough? Do you have a committee where they just say “yeah, looks kinda young to me” and convict someone for child pornography?

To be clear I’m not trying to defend these people, but it seems like trying to determine what counts legal/non-legal for fake images seems like a legal nightmare. I’m sure there are cases where this would be more clear cut (if they ai generate with a specific age, trying to do deep fakes of a specific person, etc), but a lot of it seems really murky when you try to imagine how to actually prosecute over it.

scrubbles@poptalk.scrubbles.tech on 23 Jun 2024 03:54 collapse

all good points, and I’ll for sure say that I’m not qualified enough to be able to answer that. I also don’t think politicians or moms groups or anyone are.

All I'll do is muddy the waters more. We as the vast majority of humanity think CSAM is sick, and those who consume it are not healthy. I've read that psychologists are split. Some think AI-generated CSAM is bad, illegal, and only makes those who consume it worse. Others, however, suggest that it may actually curb urges, and ask why not let them generate it, since it might actually reduce the number of real children being harmed.

I personally have no idea, and again am not qualified to answer those questions, but goddamn did AI really just barge in without us being ready for it. Fucking big tech again. “I’m sure society will figure it out”

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 10:44 collapse

AI generated csam is still csam.

This is a very dumb take that must never get any support. The extent of things politicians and elites get to do with such laws is batshit insanity.

bane_killgrind@slrpnk.net on 23 Jun 2024 01:44 next collapse

How do you deal with paint-in generation then

Snowclone@lemmy.world on 23 Jun 2024 02:15 collapse

The current method is auto-deleting NSFW images. It doesn't matter how you got there: if it detects NSFW, it dumps it and you never get an image. Besides that, there's gating NSFW content generation behind a paywall or ID wall. It would stop a lot of teenagers. Not all, but it would put a dent in it. There are also AI models that will allow some NSFW if it's clearly in an artistic style, like a watercolor painting, but will kick back NSFW realism or photography, rendered images, that sort of thing. These checks usually apply in prompt mode, paint in/out, and image-reference mode alike: prompts likely to produce NSFW images are blocked, and after generation an NSFW check runs before the image is delivered. AI services are anticipating full-on legal consequences for allowing any NSFW, or any realistic, photographic, or CGI image of a living person without their consent; it's easy to see that's what they are prepared for.
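
For concreteness, here is a minimal, hypothetical sketch of that "generate, run an NSFW check, then deliver or dump" flow; classify_nsfw, generate_image, and the threshold are illustrative placeholders, not any particular service's actual API.

```python
# Hypothetical sketch of the "generate, then NSFW-check, then deliver or dump"
# flow described above. classify_nsfw() and generate_image() are placeholder
# stand-ins for whatever classifier/generator a given service actually runs,
# and the 0.7 threshold is an assumed value, not any real product's setting.

NSFW_THRESHOLD = 0.7

def classify_nsfw(image_bytes: bytes) -> float:
    """Placeholder: return a 0..1 score that the image is NSFW."""
    raise NotImplementedError("swap in a real NSFW-detection model here")

def generate_image(prompt: str) -> bytes:
    """Placeholder for the actual text-to-image model call."""
    raise NotImplementedError("swap in a real image generator here")

def generate_with_gate(prompt: str) -> bytes | None:
    """Generate an image, then run the NSFW check before delivering it."""
    image = generate_image(prompt)
    if classify_nsfw(image) >= NSFW_THRESHOLD:
        return None  # auto-delete: a flagged image never reaches the user
    return image
```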

i_am_not_a_robot@discuss.tchncs.de on 23 Jun 2024 02:34 next collapse

Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals so they’re constantly blocking things that may be in some situations inappropriate, to the point where those services are incapable of performing a great many legitimate tasks.

Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.

bane_killgrind@slrpnk.net on 23 Jun 2024 02:35 next collapse

Sure for some tools. There are other tools that don’t do that.

Chasing after the tools and services is a waste. Make harassment more clearly defined, go after people that victimize other people.

cryptiod137@lemmy.world on 23 Jun 2024 02:55 collapse

I assume someone who is currently generating AI porn is running a model locally and not using a service, as there are absolute boatloads of generated hentai getting posted every day.

retrospectology@lemmy.world on 23 Jun 2024 10:25 collapse

Depending on the AI developers to stop this on their own is a mistake. As is preemptively accepting child porn and deepfakes as inevitable rather than attempting to stop or mitigate it.

[deleted] on 23 Jun 2024 02:09 next collapse

.

GregorGizeh@lemmy.zip on 23 Jun 2024 02:17 next collapse

My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.

Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?

This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.

E: just to clarify, I do not at all want to endorse creating nudity of minors here. Just pointing out that the girl in the article wouldn't have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.

Chozo@fedia.io on 23 Jun 2024 02:27 next collapse

While I think removing the stigma associated with having deepfakes made of you is important, I don't think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you're trying to reach.

wewbull@feddit.uk on 23 Jun 2024 12:17 next collapse

I don't see how else you do it.

“Removing the stigma” is desensitizing by definition. So you want to desensitize through… what? Education?

Chozo@fedia.io on 23 Jun 2024 12:19 collapse

I dunno, but preferably some method which doesn't involve a bunch of children committing suicide in the meantime.

Instigate@aussie.zone on 23 Jun 2024 13:04 collapse

As a child protection caseworker, I’m right here with you. The amount of children and young people I’m working with who are self-harming and experiencing suicidal ideation over this stuff is quite prevalent. Sadly, it’s almost all girls who are targeted by this and it’s just another way to push misogyny into the next generation. Desensitisation isn’t the way; it will absolutely cause too much harm before it equalises.

Mango@lemmy.world on 23 Jun 2024 19:01 collapse

Ever seen a deepfake nude of someone ugly? People make them because they wanna see you naked. Can't see how that's an insult.

jsomae@lemmy.ml on 23 Jun 2024 04:58 next collapse

This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don't think it's actually a good idea. People are entitled to privacy, on this I hope we agree, and I believe this is because of something more fundamental: people are entitled to dignity. If you think we'll reach a point in this lifetime where it will be too commonplace to be a threat to someone's dignity, I just don't agree.

Not saying the solution is to ban the technology though.

fatalError@lemmy.sdf.org on 23 Jun 2024 07:56 collapse

When you put out photos of yourself on the internet you should expect anyone to find them and do whatever they want to them. If you aren't expecting that, then you aren't educated enough on how the internet works, and that's what we should be working on. Social media is really bad for privacy and many people are not aware of it.

Now if someone took a picture of you and then edited it without your consent, that is a different action and it’s a lot more serious offense.

Either way, deepfakes are just an evolution of something that already existed before and isn’t going away anytime soon.

jsomae@lemmy.ml on 23 Jun 2024 08:07 collapse

Yeah, I mean it's basically just an easier-to-use Photoshop.

I agree people need to understand better the privacy risks of social media.

When you put out photos of yourself on the internet you should expect anyone to find them and do whatever they want to them.

Expect, yeah I guess. Doesn’t mean we should tolerate it. I expect murder to happen on a daily basis. People editing images of me on their own devices and keeping that to themself, that’s their business. But if they edit photos of me and proliferate, I think it becomes my business. Fortunately, there are no photos of me on the internet.

Edit: I basically agree with you regarding text content. I’m not sure why I feel different about images of me. Maybe because it’s a fingerprint. I don’t mind so much people editing pictures I post that don’t include my face. Hmm.

wewbull@feddit.uk on 23 Jun 2024 12:13 collapse

Yeah, I mean it's basically just an easier-to-use Photoshop.

Photoshop has the same technology baked into it now. Sure, it has “safeguards” so it may not generate nudes, but it would have no trouble depicting someone “having dinner with Bill Cosby” or whatever you feel is reputation destroying.

gaylord_fartmaster@lemmy.world on 23 Jun 2024 12:34 next collapse

Pretty sure they’re talking about generative AI created deepfakes being easier than manually cutting out someone’s face and pasting it on a photo of a naked person, not comparing Adobe’s AI to a different model.

jsomae@lemmy.ml on 23 Jun 2024 21:25 collapse

Okay, then it's an easier-to-use GIMP.

PopOfAfrica@lemmy.world on 23 Jun 2024 05:46 next collapse

It's also worth noting that too many people put out way too much imagery of themselves online. People have got to start expecting that anything they put out in public becomes public domain.

madcaesar@lemmy.world on 23 Jun 2024 10:16 collapse

I second this motion. People also need to stop posting images of themselves all over the web. Especially their own kids. Parents plastering their kids' images all over social media should not be condoned.

And on a related note we need much better sex-education in this country and a much healthier relationship with nudity.

GolfNovemberUniform@lemmy.ml on 23 Jun 2024 05:40 next collapse

This society is truly dead.

GolfNovemberUniform@lemmy.ml on 23 Jun 2024 05:46 next collapse

Using this idea will give minors a sense of complete safety when committing crimes. I don't think you have any sort of morals if you support it, but that's a question for your local law enforcement. The crime in question can seriously damage the mental health of the victim and be a cause of severe discrimination. Older minors should be responsible for their actions too.

Frokke@lemmings.world on 23 Jun 2024 07:34 collapse

You don’t turn 18 and magically discover your actions have consequences.

“Not a heavy crime”? I’ll introduce you to Sarah, Marie and Olivia. You can tell them it was just a joke. You can tell them the comments they’ve received as a result are just jokes. The catcalling, mentions that their nipples look awesome, that their pussies look nice, etc are just jokes. All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

[deleted] on 23 Jun 2024 08:09 next collapse

.

Diurnambule@jlai.lu on 23 Jun 2024 08:39 next collapse

You're right, his parents have to be punished. They didn't teach him how to respect others properly.

[deleted] on 23 Jun 2024 08:49 collapse

.

Diurnambule@jlai.lu on 23 Jun 2024 09:54 collapse

I agree with you. I was thinking of something like paying the moving fees.

[deleted] on 23 Jun 2024 09:19 next collapse

.

[deleted] on 23 Jun 2024 09:39 collapse

.

[deleted] on 23 Jun 2024 10:00 collapse

.

Evotech@lemmy.world on 23 Jun 2024 13:10 next collapse

You need to pick an age as the “magical day” anyway. Not really a good argument

[deleted] on 23 Jun 2024 13:48 collapse

.

EatATaco@lemm.ee on 23 Jun 2024 16:59 next collapse

The human mind doesn’t even really fully mature until your mid 20s. A 15 year old still has a good full decade until full maturity, and they are notorious for making impulsive decisions without realizing the consequences of their actions.

What he did was wrong and he deserves punishment, but ruining his life too for being a dumb teenager does nothing for the unimaginable harm caused to this girl; it just makes more victims.

I don't know what the right answer is, but I can tell you the wrong answer is to ruin a teenager's life over a stupid act when that isn't going to solve anything.

[deleted] on 23 Jun 2024 17:27 collapse

.

bloodfart@lemmy.ml on 23 Jun 2024 17:45 next collapse

retributive justice doesn’t work.

one of the main reasons people try to treat minors differently than adults is because they recognize that retributive justice is literally giving up on the person and doing the easiest thing for society to deal with them.

especially in cases that involve minors there’s a push for restorative, transformational and participatory justice models because they don’t give up and fall back on treating the person like an animal.

EatATaco@lemm.ee on 23 Jun 2024 19:10 collapse

A kid was arrested, but released pending further investigation, so I'm hard pressed to believe there is no punishment for this. But we're talking about teenagers here; the fact that he could be punished is there, but it was not given serious consideration, if any at all, because he isn't a fully mature adult. So what would a more serious punishment do?

This is something probably solved with education rather than more punishment.

[deleted] on 24 Jun 2024 06:25 collapse

.

EatATaco@lemm.ee on 24 Jun 2024 11:27 collapse

This is one long strawman: you're generalizing my argument for this single situation to every situation.

You’re basically accusing me of doing what you’re doing: thinking in black and white. In my case if I think that ruining his life here with severe punishment is wrong, it must always be wrong.

Ask yourself this. Is there anyone who did something very stupid in HS that turned out to be a good adult without facing severe consequences for their actions? I can think of a few.

[deleted] on 25 Jun 2024 07:06 collapse

.

[deleted] on 23 Jun 2024 17:28 collapse

.

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:22 collapse

Not when you ruin someone else’s life.

we are literally talking about an image that was made out of thin air, the description of “ruining someones life” is fucking absurd considering the very real alternative in this case.

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:21 collapse

I don’t think maturity is an explicit thing in a binary form, i would be ok with the presumption that the age of 18 provides a general expected range of maturity between individuals, it’s when you start to develop your world view and really pick up on the smaller things in life and how they work together to make a functional system.

I think the idea of putting a “line” on it, is wrong, i think it’s better to describe it “this is generally what you expect from this subset”

suburban_hillbilly@lemmy.ml on 23 Jun 2024 15:59 next collapse

Perhaps at least a small portion of the blame for what these girls are going through should be laid upon the society which obstinately teaches that a woman’s worth as a person is so inextricably tied to her willingness and ability to maintain the privacy of her areolas and vulva that the mere appearance of having failed in the endeavour is treated as a valid reason to disregard her humanity.

[deleted] on 23 Jun 2024 16:49 collapse

.

EatATaco@lemm.ee on 23 Jun 2024 16:55 next collapse

Ultimately I’m not sure where I fall on this issue, but the fact that you just mindlessly claimed that this person wants to see tits and clits, when they said nothing of the sort, just exposes how fully you realize you can’t defend an actual position.

[deleted] on 23 Jun 2024 17:24 collapse

.

suburban_hillbilly@lemmy.ml on 23 Jun 2024 19:01 next collapse

I don’t know how common it is to argue that women and girls should be treated as though they have worth and dignity regardless of their sexual proclivities and discretion, but it should be more common than it seems to be.

As for your assertion that holding this belief somehow betrays pedophilic sympathies - I have to admit, I don’t follow. Although I will say whether the literacy failure in this argument is mine or yours I am content to leave as an exercise to our readers.

[deleted] on 24 Jun 2024 05:57 collapse

.

suburban_hillbilly@lemmy.ml on 24 Jun 2024 10:28 collapse

There is no contradiction in believing that collectively shaming people who have had porn made of them is wrong and that making nonconsensual porn of people is wrong. Both are wrong. At no point did I say otherwise.

[deleted] on 25 Jun 2024 07:19 collapse

.

suburban_hillbilly@lemmy.ml on 25 Jun 2024 10:12 collapse

Women and girls should show us the goods so we don’t get so horny.

Yeah, nobody said anything even close to this. Cute trolling.

EatATaco@lemm.ee on 23 Jun 2024 19:12 next collapse

Common or not, they did not make the argument. You presumed a position and then used that made up position to launch an ad hominem.

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:18 collapse

People that want everyone to be OK with nudity and in most cases diddling kiddo’s. Same arguments, almost verbatim, have been used in the map-sphere.

you say this like they're saying that children have to be naked in order to be outside legally. The point they were making is that the primary reason half of what you said is a significant concern is due explicitly to our current social climate and its values. While not fully relevant, they still made a point, and considering how bad your argumentative rhetoric is, i'd say it's a fair shot at something you said, considering you didn't have much else to say other than accusing someone of being a pedophile i guess.

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:15 collapse

We have been able to see faces since forever and people are still mocked for having faces that don’t fit the popular norms. Your argument is flawed.

i have vitiligo on my face, have yet to be mocked for it. People only ask about it respectfully.

People still have the right to privacy.

actually, no you don't. Very few places have legal protections for privacy, both online and physically; if you go outside in most states in the US, your image is being fed into some sort of crime-stopping AI dataset somewhere.

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:13 collapse

All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

and by the time they're 18 and moving on to college or whatever, they're probably busy not fucking worrying about whatever happened in high school, because at the end of the day you have two options here:

  • be a miserable fuck.
  • try to be the least miserable fuck you can, and do something productive.

Generally people pick the second option.

And besides, at the end of the day, it’s literally not real, none of this exists. It’s better than having your nudes leaked. Should we execute children who spread nudes of other children now? That’s a far WORSE crime to be committing, because now that shit is just out there, and it’s almost definitely on the internet, AND IT’S REAL.

Seems to me like you’re unintentionally nullifying the consequences of actual real CSAM material here.

Is my comment a little silly and excessive? Yes, that was my point. It’s satire.

Lemongrab@lemmy.one on 24 Jun 2024 17:28 collapse

Victims of trauma don't just forget because time passes. They graduate (or don't) and move on in their lives, but the lingering effects of that traumatic experience shape the way they look at the world: whether they can trust, body dysphoria, whether they can form long-lasting relationships, and other lasting trauma responses. Time does not heal the wounds of trauma; they remain as scars that stay vulnerable forever (unless deliberate action is taken by the victim to dismantle the cognitive structure formed by the traumatic event).

KillingTimeItself@lemmy.dbzer0.com on 25 Jun 2024 00:23 collapse

yeah, but we're also talking about something that quite literally never happened; it was all manufactured, and i don't want to downplay the effects of that.

This is probably the best time ever to start being an e slut because you can just say it was deep faked and people don’t exactly have a reason to disagree with you.

Also while trauma is permanent, i would also like to remind you that every life experience you have throughout your life is also permanent, it cannot be changed, it cannot be undone, it cannot be revoked. You simply have to live with it. The only thing that changes your experiences and memories around it, is how you handle it internally.

I would probably be more compassionate with you if we were literally talking about revenge porn, or whatever the correct stipulation would be here, i’m not sure, i don’t exactly fuck people on the regular so i’m not really qualified here lmao.

But like i said, this is just AI generated. Everyone knows about AI now; how many people do you think are going to hear that and go "yeah, that makes sense"? Probably most of them. Highschoolers might be a bit more unreasonable, but nothing changes the fact that they simply aren't real. You just have to do your best to dissociate yourself from that alternate reality where they are, because they quite literally are not.

some people would consider it to be traumatic, others wouldn’t. I wouldn’t give a shit either way, i might even further the rumors because i think it would be funny. It’s all a matter of perspective.

i_am_not_a_robot@discuss.tchncs.de on 23 Jun 2024 02:25 next collapse

“This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

“If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail…so he would get a punishment for what he actually did,” McAdams told CNN.

There’s a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.

“It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.

“[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim’s family asks for it,” Cruz said. “Elliston’s Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

BS

It’s been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.

Real or fake pornography including unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it’s extra illegal.

Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat’s rules and would have been taken down:

  • We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
  • We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
  • We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
cryptiod137@lemmy.world on 23 Jun 2024 02:45 collapse

Is revenge porn illegal federally? Not that that would really matter, a state could still not have a law and have no way to prosecute it.

The fact that some state recently passed a revenge porn law makes it clear you're just wrong.

On Snapchat's ToS: luckily I never ran into the first point personally, but as a teenager I heard about it happening quite a bit.

The second point is literally not enforced at all, to the point where they recommend some sort of private Snapchats which are literally just porn made by models

Don’t know how well they enforce the last point

i_am_not_a_robot@discuss.tchncs.de on 23 Jun 2024 04:13 collapse

I looked it up before posting. It’s illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.

I’ve noticed recently there are many reactionary laws to make illegal specific things that are already illegal or should already be illegal because of a more general law. We’d be much better off with a federal standardization of revenge porn laws than a federal law that specifically outlaws essentially the same thing but only when a specific technology is involved.

[deleted] on 23 Jun 2024 09:23 next collapse

.

Evotech@lemmy.world on 23 Jun 2024 13:09 next collapse

It’s not been reported on much because it doesn’t work that well. It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:09 collapse

It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best

absolutely, all of the material out there for marketing is digitally manipulated by a human to some degree. And if it isn’t then honestly, i don’t know what you’re using AI image generation for lmao.

Spedwell@lemmy.world on 23 Jun 2024 13:19 collapse

404media is doing excellent work on tracking the non-consensual porn market and technology. Unfortunately, you don't really see the larger, more mainstream outlets giving it the same attention beyond its effect on Taylor Swift.

deFrisselle@lemmy.sdf.org on 23 Jun 2024 09:38 next collapse

Odd that there is no mention of the parents contacting the police and working through them to get the images taken down. Technically and legally, the photos would be considered child porn. Since it's over the internet, it would bring federal charges even though there may be state charges too. Something was handled wrong if all the kid is getting is probation.

wewbull@feddit.uk on 23 Jun 2024 12:00 next collapse

Technically and legally the photos would be considered child porn

I don’t think that has been tested in court. It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone. A form of image based libel, but I don’t think that’s currently a legal concept. It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

In fact, that raises an interesting simile. We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked. We allow images of human physical abuse as long as they are faked. Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them. The resulting "works of art" are not under such limitations as far as I'm aware.

What’s the line here? Parental consent? I think that could lead to some very concerning outcomes. We all know abusive parents exist.

I say all of this, not because I want to defend anyone, but because I think we're about to set some really bad legal precedents if we're not careful. Ones that will potentially do a lot of harm. Personally, I don't think the concept of any image, or any other piece of data, being illegal holds water. Police people's actions, not data.

todd_bonzalez@lemm.ee on 23 Jun 2024 12:44 collapse

I don’t think that has been tested in court.

It has and it continues to be.

And even if it hadn’t, that’s no excuse not to start.

It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

It depicts a real child and was distributed intentionally because of who it depicts. Find me the legal definition of pornography that demands that pornography be a "depiction of reality". Where do you draw the line with such a qualifier?

I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone.

It is by definition “photo manipulation”, but the intent is to sexually exploit a child against her will. If you want to argue that this counts as a legal form of free speech (as libel is, FYI), you can fuck right on off with that.

A form of image based libel, but I don’t think that’s currently a legal concept.

Maybe actually know something about the law before you do all this “thinking”.

It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

Oh no, not the sLiPpErY sLoPe!!!

We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked.

Little girls are the same as animals, excellent take. /s

Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them.

What kind of horror films are you watching that has naked children in sexual situations?

What’s the line here?

Don’t sexually exploit children.

Parental consent?

What the living fuck? Parental consent to make porn of their kids? This is insane.

I say all of this, not because I want to defend anyone, but because I think we’re about to set some really bad legal precidents if we’re not careful.

The bad legal precedent of banning the creation and distribution of child pornography depicting identifiable minors?

Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water.

Somebody check this guy’s hard drive…

[deleted] on 23 Jun 2024 20:38 collapse

.

suburban_hillbilly@lemmy.ml on 23 Jun 2024 13:16 next collapse

photos

They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

There isn't any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls' faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

I’m sure it doesn’t feel all that different to the girls in the pics, or to the boys looking at it for that matter. There is some degree of harm here without question. But we must tread lightly because there is real danger in categorizing algorithmic guesswork as reliable which many authoritarian types are desperate to do.

wired.com/…/parabon-nanolabs-dna-face-models-poli…

This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error prone black-boxes and that fact must be driven hard into the consciousness of every living person.

For some, I’m sure purely unrelated reason, I feel like reading Phillip K Dick again…

daellat@lemmy.world on 23 Jun 2024 18:13 next collapse

I've only read Do Androids Dream of Electric Sheep? by him; what other book(s) should I check out?

Rai@lemmy.dbzer0.com on 24 Jun 2024 00:28 collapse

Androids/sheep was so good

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:08 next collapse

They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

most phone cameras alter the original image with AI shit now, it's really common, they apply all kinds of weird correction to make it look better. Plus if it's social media there's probably a filter somewhere in there. At what point does this become the Ship of Theseus?

my point here is that if we're arguing that AI images are, semantically, not photos, then most photos on the internet, including ones of people, would also arguably not be photos to some degree.

suburban_hillbilly@lemmy.ml on 24 Jun 2024 01:39 collapse

The difference is that a manipulated photo starts with a photo. It actually contains recorded information about the subject. Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

Yes it is semantics, it’s the reason why we have different words for photography and drawing and they are not interchangeable.

Rekorse@lemmy.dbzer0.com on 24 Jun 2024 11:47 next collapse

The deepfakes would contain the prompt image provided by the creator. They did not create a whole new approximation of their face, as the entire pool the model can pull from for that specific part is a single image or group of images provided by the prompter.

KillingTimeItself@lemmy.dbzer0.com on 25 Jun 2024 00:36 collapse

yeah idk why they said that, it’s objectively wrong.

KillingTimeItself@lemmy.dbzer0.com on 25 Jun 2024 00:36 collapse

Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

this is explicitly untrue; they literally do. You are just factually wrong about this. While it may not be in the training data, how do you think it manages to replace the face of someone in one picture with the face of someone else in some other video?

Do you think it just magically guesses? No, it literally uses a real picture of someone. In fact, back in the day with ganimation and early deepfake software, you literally had to train these AIs on pictures of the person you wanted it to do a faceswap on. Remember all those singing deepfakes that were super popular back a couple of years ago? Yep, those literally trained on real pictures.

Regardless, you are still ignoring my point. My question here was: how do we consider AI content to be "not a photo", but consider photos manipulated numerous times, through numerous different processes, which are quite literally not the original photo, to be a literal "photo"? To rephrase it more simply for you and other readers: "why is AI-generated content not considered to be a photo, when a heavily altered photo that only vaguely resembles its original in most aspects is considered to be a photo?"

You seem to have missed the entire point of my question entirely. And simply said something wrong instead.

Yes it is semantics

no, it's not; this is a Ship of Theseus premise here. The semantics are just how we contextualize and conceptualize things into word form. The problem is not semantics (they are just used to convey the problem at hand); the problem is a philosophical conundrum that has existed for thousands of years.

in fact, if we're going by semantics here, technically "photograph" is rather broad, as it literally just defines itself as "something in likeness of", though it defines that as taken by the method of photography. We could arguably remove that part of it, and simply use it to refer to something that is a likeness of something else. And we see this in the contextual usage of words: a "photographic" copy is often used to describe something that is similar enough to something else that, in terms of a photograph, they appear to be the same thing.

Think about scanning a paper document, that would be a photographic copy of some physical item. While it is literally taken via means of photography. In a contextual and semantic sense, it just refers to the fact that the digital copy is photographically equivalent to the physical copy.

suburban_hillbilly@lemmy.ml on 25 Jun 2024 03:18 collapse

Oh FFS, I clipped the word new. Of course it uses information in the prompt. That’s trivial. No one cares about it returning the information that was given to it in the prompt. Nevertheless, mea culpa. You got me.

this is a ship of thesseus premise here

No, it really isn’t.

The purpose of that paradox is that you unambiguously are recreating/replacing the ship exactly as you already know it is. The reason the 'ai' in question here is even being used is that it isn't doing that. It's giving you back much more than it was given.

The comparison would be if Theseus' ship had been lost and you definitely don't have the ship anymore, but had managed to recover the sail. If you take the sail to an experienced builder (the ai) who had never seen the ship, then he might be able to build a reasonable approximation based on inferences from the sail and his wealth of knowledge, but nobody is going to be daft enough to assert it is the same ship. Does the wheel even have the same number of spokes? Does it have the same number of oars? The same weight of anchor?

The only way you could even tell if his attempted facsimile was close is if you already had intimate knowledge of the ship from some other source.

…when a heavily altered photo that only vaguely resembles its original in most aspects is considered to be a photo?"

Disagree.

KillingTimeItself@lemmy.dbzer0.com on 26 Jun 2024 00:37 collapse

No, it really isn’t.

i would consider it such; you said as much in your original post, that the entire crux of the issue is the semantic difference between a real photograph, as physically taken by the camera, and what could be considered an image, whatever that constitutes. For the purposes of the semantic argument here, let's say digitally drawn art, clip art, whatever; it doesn't matter. It's objectively not a photo, and that's what matters here.

The pupose of that paradox is that you unambiguously are recreating/replacing the ship exactly as you already know it is. The reason the ‘ai’ in question here is even being used is that it isn’t doing that. It’s giving you back much more than it was given.

Yeah, so the reason the thought experiment does this is that it creates an incredibly sterile environment which allows us to easily study and research the question at hand. In this case it boils things down to something as close as possible to "objective relation" versus "symbolic relation", i.e. the two extremes of the thought experiment at hand. It's still not easy to define what the true answer to the question is, and that's why it's kept so sterile.

The comparison would be if Thesues’ ship had been lost and you definitely don’t have the ship anymore, but had managed to recover the sail. If you take the sail to an experienced builder (the ai) who had never seen the ship, then he might be able to build a reasonable approximation based on inferences from the sail and his wealth of knowledge, but nobody is going to be daft enough to assert it is same ship. Does the wheel even have the same number of spokes? Does it have the same number of oars? The same weight of anchor?

this is not what i was making my statement about. If you read my original comment you might pick up on this.

Disagree.

yes, ok, and this is what my thought experiment comparison was about in this case. The specific thing i was asking you was how we define a photo and how we define an image, because what would normally be considered a photo could arguably be considered an image, on account of the various levels of image manipulation taking place.

While rather nitpicky in essence, i suppose, the point i'm making here is that your entire statement might be upended entirely by the fact that the original photo used may not even be a photo at all, making the entire distinction redundant to begin with. Since you never defined what counts as a "photo" and what counts as an "image", there is no clear distinction between them, other than the assumed AI image manipulation that you talked about. Which, like i said, most phones do.

In short, i don't think it's a very good way of conceptualizing the fundamental problem here, because it's rather loose in its requirements. If you wanted to argue that the resulting imagery simply is not akin to actual real imagery (in a literal sense), i see no reason to disagree. However, unfortunately the general populace does not care about the semantic definition of whether or not an image is a photo. So as far as most people are concerned, it's either "deep faked" or "real"; there is no alternative.

Legally, since we'd be talking about revenge porn and CP here, i don't see a reason to differentiate on the semantics, because as far as the law is concerned, and as far as most of the general public is concerned, someone deepfaking revenge porn is arguably still just making revenge porn. While AI-generated CP may not be real CP, marrying a 12-year-old is legal in some places, and it'd still be fucking weird if you did it. If you are creating AI CP, that's pretty fucking weird, and there isn't exactly a good argument for doing that. (ignoring the one obvious counter example)

Xylogx@lemmy.ml on 29 Jun 2024 13:52 collapse

Whether or not you consider them photos, DOJ considers them child porn and you will still go to jail.

[deleted] on 23 Jun 2024 13:32 collapse

.

sunbather@beehaw.org on 23 Jun 2024 10:39 next collapse

society has become so used to girls and women being considered less that there is a scary amount of rationalization as to why it's fine, actually, to completely annihilate all remaining bodily autonomy they have left. this is an explosion in suicides of young girls and adult women alike begging to happen. wake the fuck up.

whodoctor11@lemmy.ml on 27 Jun 2024 14:20 collapse

wtf, why does this comment have 7 downvotes

delirious_owl@discuss.online on 23 Jun 2024 13:31 next collapse

Is it CSAM if it was produced by AI?

Nyoka@lemm.ee on 23 Jun 2024 13:35 next collapse

In this case, yes. Visually indistinguishable from a photo is considered CSAM. We don’t need any new laws about AI to get these assholes. Revenge porn laws and federal CSAM statutes will do.

[deleted] on 23 Jun 2024 13:47 next collapse

.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 14:48 collapse

Reporting and getting my comment removed for raising the hypothetical threat of becoming a CSAM-planting victim? Wow, I think I struck a chord with you. It makes sense, people like you never think things through before suggesting them. Such people should never get the tiniest sliver of power.

papertowels@lemmy.one on 23 Jun 2024 15:08 next collapse

Nothing about your comment addressed why it should be treated differently if it’s ai-generated but visually indistinguishable.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 16:13 collapse

There is not yet AI that can do this. Also, is there real world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to liberty of citizens.

All those who wanted AI so much, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora AI generated 720p deep fake porn/gore/murder videos.

papertowels@lemmy.one on 23 Jun 2024 18:00 collapse

Just passing through, no strong opinions on the matter nor is it something I wish to do deep dive research on.

Just wanted to point out that your original comment was indeed just a threat that did nothing to address OPs argument.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 18:58 collapse

It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

The problem with claiming AI generated art as CSAM is that there is no possible way to create an objective definition of what “level” of realism is real and what is not. A drawing or imaginary creation is best left not defined as real in any capacity whatsoever. If it is drawn or digitally created, it is not real, period. Those people thinking of good uses of AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control as far as human society goes.

Even though China is incredibly advanced and proactive on trying to control this AI deepfake issue, I do not trust any entity in any capacity on such a problem impossible to solve on a country or international scale.

I just had a deja vu moment typing this comment, and I have no idea why.

Zoot@reddthat.com on 23 Jun 2024 20:34 collapse

Dude, it depicts a child in a sexual way. Find some other way to defend lolis than trying to say "the terms aren't right, really it's just libel". Fuck outta here. Child, depicted in a sexual way -> CSAM. Doesn't matter if it was drawn, produced, or photographed.

magi@lemmy.blahaj.zone on 23 Jun 2024 20:58 next collapse

It is very clear that they produce and/or consume said material and feel threatened by anyone calling it what it is

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 22:26 collapse

So if I draw a stick figure with 2 circles, call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?

magi@lemmy.blahaj.zone on 23 Jun 2024 23:33 next collapse

In what world does that justify creating PHOTOREALISTIC sexual imagery of a REAL child? You’re out of your mind, royally.

ssj2marx@lemmy.ml on 24 Jun 2024 06:51 collapse

Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 06:59 collapse

Glad that it will always remain a hot take.

The problem with your argument is there cannot be developed a scale or spectrum to judge where the fake stops and real starts for drawings or AI generated media. And since they were not recorded with a camera in real world, they cannot be real, no matter what your emotional response to such a deplorable defamation act may be. It is libel of an extreme order.

Cuties was shot with a camera in real world. Do you see the difference between AI generated media and what Cuties was?

ssj2marx@lemmy.ml on 24 Jun 2024 07:08 collapse

there cannot be developed a scale or spectrum to judge where the fake stops and real starts

Ah, but my definition didn’t at all rely on whether or not the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter if there were real actors or not, if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

And yes I understand that that will always be a subjective judgement with a grey area, but not every law needs to have a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 07:22 collapse

An image is not merely an arrangement of pixels in a jpeg,

I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in real world with a camera, it cannot be real.

Who will be the judge? If there is some automated AI created, who will be the one creating it? Will it be perfect? No. We will end up in the situation that Google caused to users, like doctors, married parents and legitimate people being labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you also said it. The only accurate way to judge it will be a very large team of forensic experts on image/video media, which is not feasible for the amount of data social media generates.

not every law needs to have a perfectly defined line

And this is where the abuse by elites, politicians and establishment starts. Activists and dissidents can be easily jailed by CSAM being planted, which would in this case be as simple as AI pictures being temporary drive by downloads onto target’s devices.

ssj2marx@lemmy.ml on 24 Jun 2024 07:34 collapse

Who will be the judge?

The same people that should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting perfect be the enemy of good. Allowing generated or drawn images of sexualized children to exist has external costs to society in the form of normalizing the concept.

The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 07:45 collapse

Have you considered the problem of doctors, married parents and other legitimate people being labelled as CSAM users and pedophiles? They are not obligated to take the brunt of misjudgement by tools developed to judge such media. This is not a hypothetical scenario; it has already happened in the real world and has caused real-world damage to people.

The argument about planted CSAM is not incoherent; it has already played out against many people. It is one of the favourite tools of elites and ruling politicians. I am less worried about it only because such a law, one that would brutally misjudge the masses over fictional media, thankfully does not exist.

ssj2marx@lemmy.ml on 24 Jun 2024 08:01 collapse

How many times can I say “social context” before you grok it? There’s a difference between a picture taken by a doctor for medical reasons and one taken by a pedo as CSAM. If doctors and parents are being nailed to the cross for totally legitimate images then that strikes me as evidence that the law is too rigid and needs more flexibility, not the other way around.

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 08:34 collapse

If a pedophile sets up a hospital or clinic room and photographs a naked kid, will that be okay? Do you understand that these problems are impossible to solve just like that? Parents also take photos of their kids, and they do not take photos the way a doctor would; they shoot in far more casual settings than a clinic. Would parents be considered pedophiles? By the standard you propose, yes.

You are basically implying that social defamation is what matters here, and that the trauma caused to the victim of such fictional media is the problem. But this is exactly what anti-AI people like me were trying to warn against. And since these models are open source and in public hands, the cat is out of the bag. Stable Diffusion models run on potato computers and take at most 2-5 minutes to generate a photo, and 4chan has entire guides for uncensored models. This problem will be 100x worse in a couple of years, 1000x worse in five, and infinitely worse in a decade. Nothing can be done about it. This is what the AI revolution is. Future generations of kids are fucked thanks to AI.

The best thing one can do is protect one's privacy and keep one's photos from being out there. Nobody can win this battle; even in the most dystopian hellhole with maximum surveillance, there will be gaps.

todd_bonzalez@lemm.ee on 25 Jun 2024 01:08 collapse

These are some insane mental gymnastics.

Congratulations on the power trip purging every comment that calls you out.

frauddogg@lemmygrad.ml on 23 Jun 2024 15:15 collapse

Still on your fuckshit, I see. Smh.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 16:19 collapse

This is not a “CSAM” problem, since there is no physical outcome. This is a defamation and libel problem, and should be treated as such. If I see nonsensical notions, I will call them out without fear.

ssj2marx@lemmy.ml on 24 Jun 2024 06:47 collapse

Do you not consider photoshopping an actual person’s photos into porn abusive towards that person?

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 06:55 collapse

I consider it defamation and libel. Yes, it is faux porn, but ultimately the goal is to harass and defame the person.

todd_bonzalez@lemm.ee on 23 Jun 2024 13:38 next collapse

Is it material that sexually abuses a child?

[deleted] on 23 Jun 2024 14:45 next collapse

.

todd_bonzalez@lemm.ee on 23 Jun 2024 15:51 collapse

So you don’t think that nudifying pics of kids is abusive?

Says something about you I think…

[deleted] on 23 Jun 2024 16:11 collapse

.

todd_bonzalez@lemm.ee on 23 Jun 2024 16:17 next collapse

drawings

Nobody said anything about drawings, but interesting default argument… Thanks for telling the class that you’re a lolicon pedo.

the liberty of masses be stomped and murdered

Nobody said that anyone should be stomped and murdered, so calm down, lmao. We’re just saying that child porn producers, consumers, and apologists are vile, disgusting perverts who should be held accountable for their crimes against children.

[deleted] on 23 Jun 2024 16:39 collapse

.

magi@lemmy.blahaj.zone on 23 Jun 2024 16:56 collapse

They’re making them unsafe? You and your bullshit are making them unsafe. Every comment you post reeks of your true character. Go get help.

[deleted] on 23 Jun 2024 17:44 next collapse

.

todd_bonzalez@lemm.ee on 23 Jun 2024 20:09 collapse

Do you really think being insufferable is going to change any minds here?

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 22:24 collapse

The only thing insufferable is reactionary people in this thread. It must be easy labelling people rather than calmly thinking about things.

todd_bonzalez@lemm.ee on 23 Jun 2024 22:28 collapse

You are absolutely not coming across as calm fwiw.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 22:40 collapse

Kindly provide an answer for the false accusations on the other comment within two hours, since you are currently active. No further notice will be given.

todd_bonzalez@lemm.ee on 23 Jun 2024 20:15 collapse

I’m making kids unsafe by…

checks notes

…being firmly and unwaveringly against the sexual exploitation of children.

I really can't stress enough that this was an actual 15-year-old girl who was pornified with AI. This isn't some "erotic drawings" argument; the end results were photorealistic nudes with her unmodified face. This isn't some completely AI-generated likeness. Pictures of her from social media were exploited to remove her clothes and fill in the gaps with models trained on porn. It was nonconsensual pornography of a kid.

Anyone who read this story, and feels the need to defend what was done to this girl is a fucking monster.

I can’t believe that the person defending sex crimes of this magnitude is a fucking mod.

[deleted] on 23 Jun 2024 22:32 collapse

.

magi@lemmy.blahaj.zone on 23 Jun 2024 22:57 collapse

You’re defending it by playing it down as simple defamation. Quit your bullshit and go get help.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 23:39 collapse

Since you are also labelling me as a pedophile/loli fan, I would prefer that you provide evidence of the same. Failing to do so will require me to take moderator action.

Justifying your absurdity using hivemind baiting tactics may work on Reddit, but this is Lemmy.

Edit: I have learned my lesson. I will never be this tolerant again. Disgusting people. Leniency just makes you a doormat.

magi@lemmy.blahaj.zone on 23 Jun 2024 23:52 collapse

Refute literally anything I said? My post history in this thread, mostly replying to you, is still present. My absurdity? You'll go down as the laughing stock you are right now. One only needs to take a quick look around this thread to realize your view on this is unpopular. "This is Lemmy", yeah, and apparently people like you still exist on the internet. You're defending and playing down the production and consumption of photorealistic sexual imagery depicting a REAL underage girl.

magi@lemmy.blahaj.zone on 23 Jun 2024 16:53 next collapse

Found the weirdo

Zoot@reddthat.com on 23 Jun 2024 20:57 collapse

Found the Loli* ftfy

refalo@programming.dev on 23 Jun 2024 23:41 collapse

attack the argument, not the person

TheAnonymouseJoker@lemmy.ml on 24 Jun 2024 06:52 next collapse

I think I have been attacked far too much here already. These nasty people are labelling me a supporter of pedophilia. I would suggest capital punishment for pedophiles, and at least a non-bailable offence for defamation actors like the one in the article, as well as for these internet creatures who go around labelling people falsely.

Zoot@reddthat.com on 24 Jun 2024 16:19 collapse

Then don't defend them? You're trying to tell everyone that what is described in the article above, a child who had PHOTOREALISTIC pictures made of her, isn't CSAM.

It is. Deleting the comments of everyone who disagrees with you will not change that, and if anything, it WILL make you seem even more like the bad guy.

Surreal@programming.dev on 24 Jun 2024 11:43 collapse

Practice what you preach. Read the thread again: what do you think "says something about you" means?

delirious_owl@discuss.online on 23 Jun 2024 15:35 next collapse

Is it material that may encourage people to sexually abuse a child?

todd_bonzalez@lemm.ee on 23 Jun 2024 15:51 next collapse

That’s one definition, sure.

Now answer the very simple question I asked about whether or not child porn is abusive.

MehBlah@lemmy.world on 23 Jun 2024 20:13 collapse

Any sex act involving an adult and a child/minor is abusive by its very nature.

NotMyOldRedditName@lemmy.world on 23 Jun 2024 16:34 collapse

It's actually not clear that viewing such material leads a person to commit in-person abuse.

Providing non-harmful ways to access the content may even lead to less abuse, since the content people seek would no longer come from abuse, reducing demand for abusive material.

That being said, this instance isn't completely fabricated, and its further release is harmful because it involves a real person and will have an emotional impact.

delirious_owl@discuss.online on 23 Jun 2024 20:00 collapse

There are other instances where it was completely fabricated, and the courts still ruled it was CSAM and convicted.

NotMyOldRedditName@lemmy.world on 23 Jun 2024 20:55 collapse

There have been, yes, but that doesn't mean it's the right ruling. The law also varies by jurisdiction, because it is a murky area.

Edit: in the USA it might not even be illegal unless there was intent to distribute:

By the statute's own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.

So local AI generating fictional material that is not distributed may be okay federally in the USA.

delirious_owl@discuss.online on 24 Jun 2024 01:49 next collapse

Serious value? How does one legally argue that their AI-generated child porn stash has "serious value" so that they don't get incarcerated?

Laws are weird.

NotMyOldRedditName@lemmy.world on 24 Jun 2024 03:12 collapse

Have the AI try to recreate existing CP already deemed to have serious value and then have all the prompts/variations leading up to the closest match as part of an exhibit.

Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.

delirious_owl@discuss.online on 24 Jun 2024 03:30 collapse

Prison*

NotMyOldRedditName@lemmy.world on 24 Jun 2024 04:13 collapse

Ah my bad, you’re right.

Then you’ll probably get shanked if any of the other inmates find out you were sent there for CP.

delirious_owl@discuss.online on 24 Jun 2024 04:15 collapse

Hey bro, it was just AI-generated tho! And it had serious value!

Plz don’t stab me

[deleted] on 25 Jun 2024 07:19 collapse

.

HauntedCupcake@lemmy.world on 23 Jun 2024 15:47 next collapse

I'm not sure where you're going with that? I would argue that yes, it is: it's sexual material of a child, with that child's face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.

But you could also be taking the stance of "AI trains on adult porn and is merely recreating child porn; no child was actually harmed in the process." As I've said above, I disagree with that, especially in this particular circumstance.

Apologies if it’s just my reading comprehension being shit

fine_sandy_bottom@discuss.tchncs.de on 23 Jun 2024 19:39 next collapse

Is that the definition of CSAM?

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:03 collapse

It would be material of, and/or containing, child sexual abuse.

Majestic@lemmy.ml on 23 Jun 2024 22:09 next collapse

It should be considered illegal if it was used to harm or sexually abuse a child, which in this case it was.

Whether it should be classed as CSAM or as something separate, I tend to think something separate: a revenge-porn-type law that still allows distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than of offenders who are all very sick, dangerous, and actually violent adult pedophiles victimizing children.

Consider the following:

  1. Underage girl takes a picture of her own genitals. It is unfortunately classified under the unhelpful and harmful term "child porn", and she can be charged and registered as a sex offender, but it's not CSAM and shouldn't be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism which hurts survivors of childhood sexual trauma as well as adults).

  2. Underage girl takes a picture of her genitals and sends it to her boyfriend. Again, this shouldn't be CSAM (though she may unfortunately be charged similarly): she consented, and we can assume there wasn't any unreasonable level of coercion. What it is bound by, unfortunately, are certain notions of puritanism that are very American.

  3. From 2, the boyfriend shares it with other boys. Now it's potentially CSAM, or at the least revenge porn of a child, since she didn't consent and it could be used to harm her; but punishment has to be modulated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.

  4. Underage boy cuts out a photo of an underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress, maybe a petite one, and uses it for his own purposes in private. Not something I think should be classed as CSAM.

  5. Underage boy uses AI to do the same as above, but more believably. Again, I think it's kind of creepy, but if he keeps it to himself and doesn't show anyone or spread it around, it's just youthful weirdness, though really he probably shouldn't have easy access to those tools.

  6. Underage boy uses AI to do the same as 4-5, but this time he spreads it around, defaming the girl. She and her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at, and pleasuring themselves to, fake but realistic images of her without her consent, which is violating and makes one feel unsafe. Worse, she will probably be bullied for it: mean things, called the s-word, etc.

Kids are weird and do dumb things, though unfortunately boys, especially in our culture, have a propensity to do things that hurt girls far more than the inverse, to the point that it's not really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this you need to address patriarchy and misogyny on a cultural level, teach boys empathy and respect for girls and women, and frankly do away with all the abusive pornography that's so prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it's structurally opposed to being able to do such a thing. It also couldn't hurt to peel back the stigma and shame around sexuality and nudity in the US, which stems from its reactionary Christian culture, but again I don't think that will ever happen in the US as it exists, not this century anyway.

Obviously not getting into adults here as that doesn’t need to be discussed, it’s wrong plain and simple.

Bottom line, I think, is that companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is (regardless of the age of the victim, though children can't cope with this as well as adults, so the risk of suicide or other self-harm is much higher and it should be treated as a higher priority). It's abusive and unacceptable, and companies should fear the credit card networks coming down on them hard and destroying them if they don't aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash dataset like the one used for CSAM (but separate) for such material, and major services should use it to stop the spread.
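As a rough illustration of what that hash-list matching could look like, here is a minimal sketch assuming the third-party Pillow and ImageHash Python libraries and a made-up hash value; real systems such as PhotoDNA use more robust, proprietary perceptual hashes and much larger shared lists:

```python
from PIL import Image
import imagehash

# Hypothetical shared list of perceptual hashes of known abusive images.
# The hex value below is a placeholder, not a real entry.
KNOWN_ABUSE_HASHES = {imagehash.hex_to_hash("d1d1b3a39f8f0f0e")}

# Small Hamming-distance tolerance so re-encodes, resizes and minor crops still match.
MAX_DISTANCE = 5

def matches_known_abuse(path: str) -> bool:
    """Return True if an uploaded image is perceptually close to a listed image."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - known <= MAX_DISTANCE for known in KNOWN_ABUSE_HASHES)

if __name__ == "__main__":
    if matches_known_abuse("upload.jpg"):
        print("Block the upload and report it to the shared takedown list")
```

The point is only that matching against a shared hash list is cheap enough for any large platform to run at upload time; the hard part is the governance of who maintains that list.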

deltapi@lemmy.world on 24 Jun 2024 03:38 collapse

I think it’s best to not defend kiddie porn, unless you have a republican senator in your pocket.

Majestic@lemmy.ml on 24 Jun 2024 04:06 collapse

Did you reply to the wrong person or do you just have reading comprehension issues?

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:01 collapse

I believe that in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don't really have an argument against that one.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 13:52 next collapse

Such actions should be judged not as CSAM but as defamation and libel. Anyone going around harping about AI CSAM does not care that they are empowering politicians and elites, and will happily bootlick them forever. A drawing or AI-generated media cannot be CSAM, because nobody is physically abused.

Zoot@reddthat.com on 23 Jun 2024 21:07 next collapse

Thanks for making it easy to tag you as a loli supporter. Ml has its problems, but hopefully harboring lolis/pedos who get their kicks from child-like photos won't be one of them.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 22:22 collapse

There are many pedo/loli Lemmy instances that are banned. You may like those, I do not. You might have that problem you are describing in such detail.

Mammothmothman@lemmy.ca on 23 Jun 2024 21:22 collapse

Go back to Q-an0n Diddler.

TheAnonymouseJoker@lemmy.ml on 23 Jun 2024 22:21 collapse

I see, it turned out to be useful to detect reactionary baiters. People like you are useful to the state, not to the kids you are pretending to protect.

Mammothmothman@lemmy.ca on 24 Jun 2024 02:16 collapse

Your "gubment bad" position is reactionary. Your inability or unwillingness to understand how an AI-generated image of a naked body with a minor's likeness superimposed on top of it is CSAM is telling of your true motivation. You are the type of person who reads 1984 and can't do anything but identify with the main character, completely ignoring how despicable and low that character is. The state is by no means perfect, but it's a whole lot better than the bullshit you are peddling. Eat shit and die, pedo apologist.

todd_bonzalez@lemm.ee on 24 Jun 2024 04:24 collapse

It really is a bizarre argument.

“The government bans child porn, but the government is bad, so child porn should be legal”

I feel like this person is starting with the conclusion, and justifying it with any narrative they can find that makes child porn free speech…

todd_bonzalez@lemm.ee on 24 Jun 2024 17:02 collapse

C’mon guy…

<img alt="" src="https://lemm.ee/pictrs/image/7d06b3f6-58a6-4721-8967-ac5ee9e0207e.png">

[deleted] on 23 Jun 2024 23:49 next collapse

.

whodoctor11@lemmy.ml on 23 Jun 2024 23:49 next collapse

<img alt="" src="https://img.ifunny.co/images/9cb3084065efd222f474ec8dc95d5f1f0e2fb96d5f08353aeb5c3db6ff2e45de_1.webp">

KillingTimeItself@lemmy.dbzer0.com on 24 Jun 2024 00:00 next collapse

why do i get the gut feeling that this is going to be an utter clusterfuck of a mess.

Hopefully i’m wrong.

31337@sh.itjust.works on 24 Jun 2024 00:42 next collapse

Wary of the bill. Seems like every bill involving stuff like this is either designed to erode privacy or for regulatory capture.

Edit: spelling

prole@sh.itjust.works on 24 Jun 2024 08:26 next collapse

And this makes you tired…?

winkerjadams@lemmy.dbzer0.com on 24 Jun 2024 11:28 next collapse

Yes. It's very tiring having to constantly fight this battle. Unfortunately, that's what they want, because if enough of us are too tired to care, eventually it slips through and we never get back what we lost.

31337@sh.itjust.works on 24 Jun 2024 20:10 collapse

Lol, good catch.

ssj2marx@lemmy.ml on 24 Jun 2024 17:04 collapse

introducing the AI transparency act, which requires every generative prompt to be registered in a government database

scrubbles@poptalk.scrubbles.tech on 24 Jun 2024 18:04 next collapse

and that’s what I loathe about the idiots who are for this stuff. Yes, I want to curb this stuff - but for fuck’s sake there are ways to do it that aren’t “Give big government every scrap of data on you”.

There are ways to prove I'm over 18 without needing to register my ID with a porn company, and to regulate CSAM without having to read private messages. Fuck, but we have the overlap of a Venn diagram of idiots and control freaks in Congress, and they'll happily remove all of our rights over some fear of the boogeyman.

whodoctor11@lemmy.ml on 25 Jun 2024 03:54 collapse

I don't see a problem with that. I think this information should be public, both prompt and result, because:

  • a. The "AI" companies already know it, so why shouldn't everyone else?
  • b. They use public information to train their models, so their results should also be public.
  • c. This would be the ultimate way to know that something was "AI"-generated.

This is a very different subject from giving access to your DMs. The only ones who benefit from this information not being publicly available are those who use "AI" for malicious purposes, while everyone benefits from privacy of correspondence.

ssj2marx@lemmy.ml on 25 Jun 2024 05:12 collapse

I suppose you would also be fine with every one of your google searches being in a database? Every video you’ve ever watched, even the ones in private browser tabs?

whodoctor11@lemmy.ml on 25 Jun 2024 13:28 collapse

No, and that's why I don't use Google or anything unencrypted that sends data I consider private to some datacenter. And even when I know the data is encrypted, I am careful, as anyone should be, with data leaving my computer and going to someone else's.

"AI" is not the same thing. Why would I want my prompt to be private if I don't intend to use the result in some malicious way, be it generating CSAM, cheating by having it write an article for me, or generating a deepfake video of someone for an internet scam?

ssj2marx@lemmy.ml on 25 Jun 2024 17:39 collapse

Why would I want my prompt to be private if I don’t want to use the result in some malicious way

Do you think that the only thing people use AI for is making deepfakes and CSAM? AFAIK the most common use is generating porn. Now, I don’t think generating regular porn is “malicious”, but I certainly understand why most people (self included) want to keep what they generate private.

whodoctor11@lemmy.ml on 26 Jun 2024 00:20 collapse

I don't think people's right to generate whatever image they want to jerk off to is fundamental, or more important than avoiding "AI" scams and CSAM generation. There are other ways to jerk off: there's plenty of real-people porn online and also lots and lots of hentai, for literally every taste. "AI" porn has only two particularities not covered by those options. One is generating exactly the scene you want, and in the very remote case that what you imagined has never been produced before, you can pay an artist to do so. The other is deepfake porn, which should be a crime, regardless of whether you publish the image.

BigMacHole@lemm.ee on 24 Jun 2024 03:49 next collapse

Wears the It’s Always Sunny In Philadelphia picture when you need it?

-Republicans trying to Protect Save do SOMETHING with Children!

dessalines@lemmy.ml on 24 Jun 2024 14:23 next collapse

I apologize for the inappropriate behavior and bans by @TheAnonymouseJoker@lemmy.ml in this thread. I've removed them as a mod here, banned them, and unbanned the people they inappropriately banned.

Note: if they get unbanned in the near future, it's because of our consensus procedure, which requires us admins to take a vote.

accipiter_striatus@lemmy.wtf on 24 Jun 2024 16:58 next collapse

Thank you

They have another account on lemmygrad: lemmygrad.ml/u/TheAnonymouseJoker

Is anyone around who knows the admins there, so they can take a look too?

ShiningWing@lemmygrad.ml on 25 Jun 2024 01:24 collapse

I let them know, they just took care of it 👍

Sunforged@lemmy.ml on 24 Jun 2024 19:33 collapse

Appreciate it.

boatsnhos931@lemmy.world on 24 Jun 2024 23:20 collapse

They should chop that kid's head off for those pixel titters.