Kids are making deepfakes of each other, and laws aren’t keeping up
(19thnews.org)
from Pro@reddthat.com to technology@lemmy.world on 02 Jul 11:13
https://reddthat.com/post/44868441
Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
The only defense is to train AI to draw guys with micropenises. As long as kids being kids is a defense for this shit (and to be fair, kids are pretty fucking stupid and need the freedom to grow out of that) rule makers have no power here. At least insofar as the AI to do this can be run locally on a potato.
I think the micropenis thing would just encourage this further
It’s obviously not a serious suggestion, but the reality is the tools are out there and Pandora’s box can’t be put back on the shelf. Kids can’t be held accountable in a meaningful way. This is just an issue we are going to face basically forever now.
There is a window of time during which most kids are little sociopaths and you can’t appeal to any better nature. They have the means and often no internal or external restraint. And so mutually assured destruction is my tongue in cheek answer.
AI can do penises just fine though, there’s just no market demand for it so quick and easy deep fake sites are focused on female bodies.
But I disagree with this anyway, this will be the “bullied kid brings a knife to class” of AI.
You’re disagreeing with my unserious suggestion? I just… okay. No. Micropenises aren’t a solution. I just don’t think there is one.
If you want to disagree with that, let’s hear it. I have 15 and 13 year old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.
Shut down model distribution and it’ll move to torrent. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that’s not imposed by the state which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can’t supervise their kids 24/7.
So I’m short on answers, but open to discussion.
They may be little sociopaths, but they don’t run around murdering each other. Our culture hammers it into their brains that murder is wrong and they will be severely punished for it. We need to build a culture where little boys are afraid to distribute naked photos of their classmates. Where their friends will turn them in for fear of repercussions. You do that by treating it like a crime, not by saying “boys will be boys” and ignoring the issue.
Treat it like a crime, and address separately the issue of children being tried as adults and facing lifelong consequences. The reforms needed to our juvenile justice system go beyond this particular crime.
Oh… There’s demand for it for sure.
The image generation models that exist are unquestionable proof of demand for penises. I think what’s missing is the cojones required to make a business around it. There are places even pornographers fear to tread.
The laws to ban "AI", you mean?
Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money
Even in countries a lot less corrupt than the US this is an issue.
Especially because the US government/companies doesn’t do jack shit for people
Oh I just assumed that every Conservative jerks off to kids
Get some receipts and that will be a start.
Receipts you say?
We’re at 56 pages of this now for a nice round count of 1400 charges
So far as I am aware all of these are publicly searchable court cases
Alright, now we just need the main stream media to run the story.
I mean with all the zealotry against drag shows they should be ready to run with this one right?
You’d think so, right?
They want to be regulated so they can finally have their moat. Cutting out the states’ power does mean they will only have to buy one group of politicians in Washington, and those are some relatively cheap hoes.
A 99-1 vote to drop the anti AI regulation is hardly the government voting against. The Senate smashed that shit hard and fast.
Expecting people to know about that 99-1 vote might be misplaced optimism, since it hasn’t been made into a meme yet.
especially that
Abbott (edit: Ted Cruz), who brought this one up, voted against it in the end, which is pretty confusing for a European tbh. e: i mean that it’s memeworthy lol
I’m confused - by Abbot do you mean Gov. Abbott of Texas, and are we talking about the same issue? Cuz the 99-1 vote was about a senate bill regarding AI. Greg Abbott can’t vote on senate bills, and there’s no senator named Abbot.
aaah i misremembered, it was Ted Cruz, oops :-D
arstechnica.com/…/ted-cruz-gives-up-on-ai-law-mor…
In the case of US govt, the AI part of the bill they voted against was the part that blocked regulations on AI for a period of 10 years.
In case that wasn’t clear, the US govt voted in favor of regulating AI. 99-1.
Honestly I think we need to understand that this is no different to sticking a photo of someone’s head on a porn magazine photo. It’s not real. It’s just less janky.
I would categorise it as sexual harassment, not abuse. Still serious, but a different level
A school setting generally means underage individuals are involved, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.
Disagree. Not CSAM when no abuse has taken place.
That’s my point.
If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.
It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
It’s absolutely sexual harassment.
But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.
Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs a fake. The impact to the victim is the same. The impact to the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn” people wouldn’t be so hesitant to call this what it is.
If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.
It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.
I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.
There’s a thing that was happening in the past. Not sure it’s still happening, due to lack of news about it. It was something called “glamour modeling” I think or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.
Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still csam. Why? Because of the intention behind making those pictures.
The intention to exploit.
I think generating and sharing sexually explicit images of a person without their consent is abuse.
That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.
Harassment sure, but not abuse.
Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.
Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.
Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?
Drawing a sexy cartoon that looks like an adult, with a caption that says “I’m 12”, counts. So yeah, probably.
This actually is quite fuzzy and depends on your country and even jurisdiction in your country
I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start. So, they can likely answer much better than I might be able to.
I would consider that as qualifying. Because it’s targeted harassment in a sexually-explicit manner. All the girl would have to do is claim it’s her.
Source: I’m a father of teenage daughters. I would pursue the individual(s) who started it and make them regret their choices.
I hope it might lead to a situation where dirty pics/vids are no longer a problem for the people in them, since any of it could be a deepfake. There were cases where a surfacing dirty pic was used for blackmail, ruined someone’s career or got them kicked off some committee, but since it could be a fabrication now, I hope this will be a thing of the past, soon.
That could be a socially healthy place to end up at. I don’t see it anytime soon though. Just look at the other response I got.
Anyone with half a brain will certainly claim as much. Even if people don’t fully believe it, it will blunt the most serious of social consequences.
Sure. That might end up being a socially healthy place for adults to end up.
But it will never work that way for young teens. Their brains aren’t done baking yet. They don’t have the emotional maturity to understand that enough to be “okay with it because it’s just a fake”.
That’s why we protect kids rather than just telling them “hey it’s okay…it’s only a fake.”
Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.
If the person in the image is underage then it should be classified as child pornography. If the woman whose photo is being used hasn’t consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.
Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here hunting for a semantic loophole.
It’s bullying with a sexual element. The fact that it uses AI or deepfakes is secondary, just as it was secondary when it was Photoshop, just as it was secondary when it was cutting out photos. It’s always about using it to bully someone.
This is different because it’s easier. It’s not really different because it (can be) more realistic, because it was never about being realistic, otherwise blatantly unrealistic images wouldn’t have been used to do it. Indeed, the fact that it can be realistic will help blunt the impact of the leaking of real nudes.
It’s sexually objectifying the bodies of girls and turning them into shared sexual fantasies their male peers are engaging in. It is ABSOLUTELY different because it is more realistic. We are talking about entire deepfake pornography production and distribution groups IN THEIR OWN SCHOOLS. Teenage boys cutting pictures out and photoshopping them was nowhere near as common as this is fast becoming, and it was NOT the same as seeing a naked body algorithmically derived to appear as realistic as possible.
Can you stop trying to find a silver lining in the sexual exploitation of teenage girls? You clearly don’t understand the kinds of long-term psychological harm that are caused by being exploited in this way. It was also exploitative and also fucked up when it was in Photoshop; this is many orders of magnitude more sophisticated and accessible.
You’re also wrong that this is about bullying. It’s an introduction to girls being tools for male sexual gratification. It’s LITERALLY commodifying teenage girls as sexual experiences and then sharing them in groups together. It’s criminal. The consent of the individual has been entirely erased. Dehumanization in its most direct form. It should be against the law and it should be prosecuted very seriously wherever it is found to occur.
Can you please use words by their meaning?
Also I’ll have to be blunt, but - every human has their own sexuality, with their own level of “drive”, so to say, and their dreams.
And it’s absolutely normal to dream of other people. Including sexually. Including those who don’t like you. It’s not only men who do that, either. There are no thought crimes.
So by talking about it being easier or harder, you are not making any argument at all.
However. As I said elsewhere, the actions that really harm people should be classified legally and addressed. Like sharing such stuff. But not as making child pornography because it’s not, and not like sexual exploitation because it’s not.
It’s just that your few posts I’ve seen in this thread seem to say that certain kinds of thought should be illegal, and that’s absolute bullshit. And laws shouldn’t be made based on such emotions.
I don’t know where you’re getting this “thought crime” stuff. They’re talking about boys distributing deepfake nudes of their classmates. They’re not talking about individuals fantasizing in the privacy of their own homes. You have to read all of the words in the sentences, my friend.
“Thought crime”? And you have the balls to talk about using words “by their meaning”? This is a concrete action with a product to show for it, not a thought, and it impacts someone’s life negatively without their consent, with potentially devastating consequences for the victim.
So, can you please use words by their meaning? Edit: I jumped the gun when I read “thought crime”, effectively disregarding the context. As such, I’m scratching the parts of my comment that don’t apply, and leaving the ones that do apply (not necessarily to the post I was replying to, but to the whole thread).
The author of those comments wrote a few times what in their opinion happens in the heads of others and how that should be prevented or something.
Can you please stop interpreting my words exactly the way you like? That’s not worth a gram of horse shit.
Yes I can, moreso after your clarification. I must have misread it the first time. Sorry.
Sorry for my tone too, I get dysphoric-defensive very easily (as has been illustrated).
If only we could all resolve our disputes like this every time, even after it got heated. But one interaction like this is better than none. This proves that we can all understand each other if we’re willing to put ego aside for a bit. You helped push that a bit, and I really appreciate it.
If a boy fantasises sexually about a girl, is that harmful to her? If he tells his friends about it? No, this is not harmful - these actions do not affect her in any way. What affects the girl is how the boys might then treat her differently than they would do someone they don’t find sexually attractive.
The solution, in both cases, has to be to address the harmful behaviour. The only arguments for criminalising deepfakes themselves are also arguments for criminalising sexual fantasies. That is why people are talking about thought crime: once you criminalise things that are harmless on their own, but which might down the line lead to directly harmful behaviour, there is no other distinction.
Both of these, for example, apply just as readily to discussing a shared sexual fantasy about someone who didn’t agree to it.
No distinction, that is, other than this is new and icky. I don’t want government policy to be dictated by fear of the new and by what people find icky, though. I do lots of stuff people find icky.
No an image that is shared and distributed is not the same as a fantasy in someone’s head. That is deranged. Should CSAM also be legal because making it illegal is like criminalizing the fantasies of pedophiles? Absolutely insane logical framework you have there.
This isn’t fantasy. It is content. It is media. It is material. It is produced without the consent of the girls and women being sexualized, and it commodifies their existence, literally transforming the idea of them into sexual media consumed for the gratification of boys and men.
It is genuinely incredible to me that you could be so unempathetic, so impassive, so detached from the real world and the consequences of this, that you could even make this comparison. You have seemingly no idea what you’re talking about if you believe that pornography is the same thing as mental fantasies.
And even in the case of mental fantasies, are those all good? Is it really a good thing that boys see the mere existence of the girls around them as inherently some kind of sexual availability?
When someone makes child porn they put a child in a sexual situation - which is something that we have amassed a pile of evidence is extremely harmful to the child.
For all you have said - “without the consent” - “being sexualised” - “commodifies their existence” - you haven’t told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:
I am not unempathetic, but I attribute the blame for what makes me feel bad about this situation to the fact that girls are being made to feel bad and ashamed, not to the particular technology now being used in one step of that.
Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?
The harm is:
No, but the harm certainly is not the same as CSAM and it should not be treated the same.
as far as I know there is no good evidence that this is the case and is a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (encouraging paedophiles) or less (gives them a safe outlet) or no change.
If someone fantasises about me without my consent I do not give a shit, and I don’t think there’s any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that’s different.
Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.
Hey, it’s OK to say you just don’t have any counter-argument instead of making blatantly false characterisations.
I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it’s extremely harmful. It’s not a matter of feeling ashamed; it’s a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn’t belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like without you knowing they’ve already decided that you’re a sexual experience for them.
We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn’t new, just a new streamlined way to spread it. It should be illegal. It should be against the law to turn someone’s images into AI-generated pornography. It should also be illegal to share those images with others.
Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn’t belong to you? Why does it not instead make it feel like images of your body don’t belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different? In Germany there’s a legal concept called “right to one’s own image” but there isn’t in many other countries, and besides, what you’re describing goes beyond this.
My thinking behind these questions is that I cannot see anything inherent, anything necessary about the creation of fake sexual images of someone which leads to these harms, and that instead there is an aspect of our society which very explicitly punishes and shames people - woman far more so than men - for being in this situation, and that without that, we would be having a very different conversation.
Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in “defiling” the person raped. Rape isn’t wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.
Can you be more explicit about what it’s the same as?
The sexualization of women and girls is pervasive across literally every level of western culture. What do you think the purpose is of the victim’s head and face being in the image? Do you believe that it plays an incidental and unrelated role? Do you believe that finding out that there is an entire group of people who you thought were your friends but who are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them, using algorithmically derived likenesses of your naked body, has no psychological consequences for you whatsoever? I’m just talking about it and it makes me want to throw up. It is a fucking nightmare. This is not normal. This is not creating a healthy relationship with sexuality and it is enforcing a view of women and their bodies existing for the gratification of men.
You continuously attempt to extrapolate some very bizarre metaphors about this that are not at all applicable. This scenario is horrifying. Teenage girls should not be subject to scenarios like this. It is sexual exploitation. It is dehumanization. It promotes misogynistic views of women. This is NOT a matter of sexual liberation. You’re essentially saying that men and boys can’t be expected to treat girls and women as actual people and instead must be allowed to turn their friends and peers into fetishized media content they can share amongst each other. That’s fucking disgusting. The longer you talk the more you start to sound like an incel. I’m not saying you are one, but this is the kind of behavior that they defend.
Do you think the consequences of finding out are significantly different than finding out they’re doing it in their imagination? If so, why?
And, just to be clear, by this you mean the stuff with pictures, not talking or thinking about them? Because, again, the words “media content” just don’t seem to be key to any harm being done.
Your approach is consistently to say that “this is harmful, this is disgusting”, but not to say why. Likewise you say that the “metaphors are not at all applicable” but you don’t say at all what the important difference is between “people who you thought were your friends but are in actuality taking pictures of your head and masturbating to the idea of you performing sex acts for them using algorithmically derived likenesses of your naked body” and “people who you thought were your friends but are in actuality imagining your head and masturbating to the idea of you performing sex acts for them using imagined likenesses of your naked body”. Both acts are sexualisation, both are done without consent, both could cause poor treatment by the people doing it.
I see two possibilities - either you see this as so obviously and fundamentally wrong you don’t have a way of describing why, or you know that the two scenarios are fundamentally similar but know that the idea of thought-crime is unsustainable.
Finally it’s necessary to address the gendered way you’re talking about this. While obviously there is a huge discrepancy in male perpetrators and female victims of sexual abuse and crimes, it makes it sound like you think this is only a problem because, or when, it affects women and girls. You should probably think about that, because for years we’ve been making deserved progress at making things gender-neutral and I doubt you’d accept this kind of thing in other areas.
There is an institution in society specifically designed to strip women of their autonomy, reduce them down to their sexual appeal to men, and proliferate the notions of their inherent submission to men. This simply does not exist the other way. This will not be a major problem for boys; teenage girls are not creating fucking AI porn rings with pictures of boys from their classes. That isn’t happening. Will someone do it? Almost certainly. Is it a systemic issue? No. Men’s bodies are not attacked institutionally in this way.
And you’re still trying to equate imagination with physical, tangible media. And to be clear, if several of my friends said they were collectively beating off to the idea of me naked, I would be horrified and disgusted. The overwhelming majority of people would. Again, they’ve taken you, an actual person they know and are friends with, and have turned you into a sexual goal to be attained. It is invasive, exploitative, and above all else dehumanizing. Yeah, if even one of my friends told me he jerked off to the thought of me naked I would never see him the same way again and would stop being friends with him. If I was a teenager it would probably fuck me up pretty bad to know that someone who I thought was my friend just saw me as a collection of sexual body parts with a face attached. If I found out that a whole group of boys, some of whom I might not even know, were sharing AI-generated porn with my face, it would be severely psychologically traumatizing and would probably shake my trust in men and boys for the rest of my life. This isn’t a fucking game. You’re acting like this is normal; it’s NOT FUCKING NORMAL. Photoshopping the face of a girl in your classes onto a nude body and sharing it with a group of boys is NOT NORMAL. That is severely disturbed behavior. That shows a complete malfunction in your empathy. It does if that’s your imagination too. And finding out that somebody has done that is absolutely repulsive.
And no, I find it perfectly sustainable. We have no means by which to detect pedophiles by their thoughts. But pedophilic thoughts are still wrong and are not something we tolerate people expressing. Creating CSAM is still illegal, whether or not the child is aware such content is being created of them. They can’t consent to that, as they are children. This is the same. No, we can’t fucking read people’s thoughts and punish them for them. Having thoughts like that is absolutely a sign of some obsessive tendencies and an already-forming devaluation of women and girls and reduction of them to their bodies, but the correct thing is for them to receive counseling and proper education about sex and relationships. Creating, sharing and distributing AI-generated porn of someone is so fundamentally different from that, I have to think you have a fundamental misunderstanding about what an image is. This isn’t a fucking thought. These boys and men can do whatever they want with this pornography they’ve made of you, can send it to whoever they want and share it as far and wide as they want. They have literally created porn of you without your consent. And for teenage girls this is a whole other level of fucked up. This is being used to produce CSAM. They cannot consent to this. It is a provable act of violation of women and girls. This should be illegal and should be treated extremely seriously when teenage boys are found to have done it.
You all say you’re feminists until someone comes after your fucked-up sexualities and your porn addictions. Always the same.
So the fundamental reality is that imagination and physical tangible media are very similar in this regard. That’s what you just said.
And if they were just talking about a shared fantasy - with your face? You still have the “ring” aspect, the stranger aspect, the dehumanising aspect, etc.
This is why there’s the connection that I keep getting at: there are many similarities, and you even say you’d feel similarly in both circumstances. So, the question is: do we go down the route of thought crime and criminalise the similar act? Or do we use this similarity to realise that it is not the act that is the problem, but the effects it can have on the victim?
Why do you think doing either thing (imagined or with pictures) means that someone just sees the person as a “collection of sexual body parts with a face attached”? Why can’t someone see you as an ordinary human being? While you might not believe that either thing is normal, I can assure you it is prevalent. I’m sure that you and I have both been the subject of masturbatory fantasies without our knowledge. I don’t say that to make you feel uncomfortable (and am sorry if it does) but to get you to think about how those acts have affected you, or not.
You talk again about how an image can be shared - but so can a fantasy (by talking about it). You talk again about how it’s created without consent - but so is a fantasy.
Another thought experiment: someone on the other side of the world draws an erotic image, and it happens by pure chance to resemble a real person. Has that person been victimised, and abused? Does that image need to be destroyed by the authorities? If not, why not? The circumstances of the image are the same as if it were created as fake porn. If it reached that person’s real circle of acquaintances, it could very well have the same effects - being shared, causing them shame, ridicule, abuse. It’s another example that shows how the problematic part is not the creation of an image, but the use of that image to abuse someone.
It’s my view that paedophilia, un-acted upon, is not wrong, as it harms no-one. A culture in which people are shamed, dehumanised and abused for the way their mind works is one in which those people won’t seek help before they act on those thoughts.
It’s kind of shocking to see you again erase male victims of (child) sexual abuse. For child abuse specifically, rates of victimisation are much closer than for adults.
Luckily I know you’re not representative of all of any group of people.
Your thought experiment is moot, as these are real people. You’re still not getting it. You’re still seemingly fundamentally confused about why having porn made of you without your consent is wrong.
I don’t think pedophilic thoughts should ever be tolerated outside a counselor’s office. If I found out one of my friends was a pedophile I would never speak with them again. End statement. You are in a very very very small minority of people if you disagree.
You skipped over the section where I said that a group of boys collectively sharing in a fantasy of one of their female peers and using that fantasy to sexually gratify themselves would be severely psychologically traumatizing for the victim.
Don’t make porn of people without their consent. You should face legal consequences for making porn of someone without their consent. The difference between fantasy and porn is that porn is media content; it is a real image or video and not an imagination in someone’s mind. If the fantasy is being written down and then shared, then it’s kind of erotica, isn’t it, and I also think it’s extremely fucked up to write erotica about someone you know. Don’t do that either. Wild.
That doesn’t make sense at all. That real people are affected means it is important to get this right, which means it is necessary to think carefully about it. We don’t disagree that real people are getting hurt but it seems to me that you take that to mean we should immediately jump to the first solution without regard for getting it right.
You have again not taken the opportunity to say how that translates to differing harm and hence the necessity of a differing approach, even though when you talk about the harms you always talk about things that are the same between the two things.
Yeah I know. I think the world is extremely backwards about paedophilia because the abhorrence of the crime of child sexual abuse gives them a blind-spot and makes them unable to separate the abhorrent act from the thought. I would have to guess that this is also what’s going on here (but this is less extreme). That is, I think, confirmed by your rejection of making thought experiments due to the situation involving “real people”, as if it is therefore impossible to think clearly about - maybe for you it is.
I can only hope that people learn to do so, because the current situation causes abuse (in the case of paedophiles) and is likely to lead down the road of wrongly punishing people for things done in private without external repercussions (in other cases).
There’s no other solution to this. Again, don’t make porn of people without their consent. It’s not hard. If that’s hard for you, then you need to seek help.
I talk about things that are the same to dismiss that the question of difference even matters. They are both harmful, should both be discouraged, and one results in the creation of non-consensual porn of the victim, which is provable and should be illegal.
We hate pedophiles because children cannot consent. Children do not have sexuality in the same way that adults do. Being attracted to children is an attraction to exploitation, to the desire to victimize someone. That’s abhorrent. It is not a sexual orientation that the pedophile has no choice in. They have protected and engaged with a sexual fantasy of being able to victimize a child. I would never speak to someone again if they told me they were a pedophile. Most people wouldn’t. That’s not a failure of society; it is socially necessary for such thoughts to be treated as unacceptable in all contexts. Pedophiles should be forcefully institutionalized and subject to extensive psychotherapy and monitoring.
It’s the difference between writing about the genocide of a fictional race and writing about the genocide of a real race. The line between fiction and reality is of extreme moral relevance. Incidentally drawing something that happens to look like someone you’ve never seen and drawing someone you have seen are entirely different, even if the output is the same, because we recognize intent. We recognize context. You also keep asking what the harm is in creating porn of people without their consent, and I’ve already pointed out that it’s dehumanizing, it is invasive, it is exploitative, it devalues women and girls and reduces them to their bodies, yet you still seem to have trouble empathizing with women and girls in this situation.
Do you like to make porn of people without their consent? Is that a pastime of yours? I can genuinely think of no other reason why you would be so incapable of empathizing with the victims in this situation. You sound like you need help; you might have a disorder that interferes with your ability to fully connect with and understand the emotional experiences of other people.
OK, so you only stop short of making a thought crime because you can’t prove it. That’s… consistent but extremely concerning. You have no business policing what people think about. Freedom of thought is a fundamental right and what goes on inside other people’s heads is no-one’s business but their own unless they choose otherwise.
This ought to be the trigger to realise that you’ve got something wrong in this worldview. Even if not, it’s my trigger to know that I’m not going to get anywhere, so this will be my last reply. If someone thinks that the only issue with thought crimes is in gathering evidence, our views on morality and the limits of authority are diametrically opposed and there is no point trying, but at least I understand. If it’s the thought you really want to control, then you wouldn’t have any issue with the person who makes something harmful by accident.
Disturbing that you can’t recognise how disturbing this language is. But sure: threaten people with being locked up for unchangeable yet not harmful aspects of their selves, just to make sure that they never seek help to keep from causing harm. Morality aside this can’t have any negative consequences.
Everything I have read suggests that paedophiles have no control over their attraction, only over their actions. Here’s a thought experiment which I doubt you’ll bother trying: could you decide to be attracted to children? I couldn’t. It seems to be exactly like a sexual orientation in that respect.
Your inability to engage with points of view different from your own is problematic. The victims in your narratives are always female, the perpetrators always male. Those who disagree with you are always evil perpetrators. I only say this now that I’m disengaging because there’s no point in being drawn on provocative nonsense while trying to sustain a conversation.
Yup, extrapolate my opinions on other things based on this one conversation where you are hellbent on justifying people making non-consensual pornography of women and girls.
Yeah you’re right I do not empathize with pedophiles. Is that supposed to be a gotcha or something? It should be entirely socially intolerable to be a pedophile. It makes you a danger to some of the most vulnerable people in society. Some psychological conditions make you dangerous and require you to be institutionalized. Being attracted to the idea of victimizing children is one of them.
As for the thought-crime non sequitur (since we are talking about creating AI porn), yeah, I’m really not interested in the hypothetical reality where we can read thoughts. We can’t, and that’s not what’s being discussed; you have from the outset been dead set on taking the conversation there despite its entire lack of relevance to making pornography of someone without their consent.
I also did say I have no issue with someone making a drawing that happens to look like someone they don’t know and have never seen. It’s the context, random internet guy who is still somehow incapable of understanding the harm of making non-consenting porn of your classmates and friends but is capable of empathizing with and defending pedophiles, that matters. It’s the fact that the porn is of a real person that’s relevant. A real human being who has had their likeness taken and converted into material for sexual gratification by the people in their life. That shouldn’t happen to anyone, no matter their gender. But men and boys are not out here having their bodies sexualized and policed by the state in the same way women and girls are. This whole subject affects women and girls many times more than it does men and boys. It is a systemic issue for women and girls. It connects with other things, like catcalling and body standards and sexualization of the female body. It becomes part of a system. And if the people doing it are teenage boys, it is the perfect introduction to the idea that women and girls’ bodies belong to them. They don’t even have to ask or consider their feelings or emotions before turning them into sexual material for them to consume.
You’re trying really hard to characterize me one way or another on subjects that aren’t related to this central theme. You are very defensive of the subject and seem to think it’s impossible for boys and men to simply not make non-consensual pornography of women and girls. It’s as easy as that. Just don’t do that. I have not stated my intention to make thought crime a thing; read my past comments, I explain that I would still be disgusted and horrified to discover a group of men had been sharing a group fantasy about committing sexual acts upon me. I see those thoughts as harmful in the first place, as I also stated before. But I have made no statements about making those thoughts illegal. Will I never speak with someone again if they told me I was their masturbation muse? Yup. Goodbye, never ever ever speaking to that person again. If it was a group of people? Yup, I’d definitely be psychologically traumatized by a group of people coming together and reducing me down to a sexual experience that they can masturbate about together. Yeah, that’d fuck me up pretty bad; I would never speak with any of them again and might consider restraining orders. But I never said anything about making those fantasies themselves illegal.
Content is different from thoughts. Writing a book is different than considering a plot in your head. Making a movie is different than imagining a scene in your mind. Building a house is different than considering floor plans. Pornography is different than fantasy. It is tangible outside of your mind. Humans are visual creatures. Pornography exists even once the creator is gone. It isn’t a thought; it is tangible, you can see it. The harms are worse, as porn is real. It can be shared. It can be given to others. Unlike thoughts, in just a glance porn made of you also shows you exactly in what ways the creator sees you. A visual representation of your dehumanization that’s been shared with others. It is different in every single way. Our bodies are policed so extensively in this society and culture. Now we have to compare ourselves with the fake bodies that AI gives our exploiters. Now our nudity can be taken from us with just an image online. Even an innocent, totally normal image isn’t safe in any sense of the word. Algorithms have been made to take even that away from us.
You accuse me of being unempathetic to pedophiles, a charge I will accept. I am unempathetic to them. It’s their fixation on abusing children to deal with; that’s their burden to carry. Many who do get help abuse children later anyway. Because unlike a sexual orientation, pedophilia is being fixated on abuse itself. Like rapists or others who are fixated on inflicting sexual pain and torture on others.
Sexual attraction doesn’t necessarily involve dehumanization. Unlike most other kinds of interest in a human being, it doesn’t require interest in their personality, but these are logically not the same.
In general you are using emotional arguments for things that work not through emotion, but through literal interpretation. That’s like using metric calculations for a system that expects imperial. Utterly useless.
No, it’s not. It’s literally a photorealistic drawing based on a photo (and a dataset to make the generative model). No children have been abused to produce it. Laws work literally.
No, because the woman is not being literally sexually exploited. Her photo being used without consent is, I think, subject of some laws. There are no new fundamental legal entities involved.
I think I agree. But it’s neither child pornography nor sexual exploitation and can’t be equated to them.
There are already existing laws for such actions, similar to using a photo of the victim and a pornographic photo, paper, scissors, pencils and glue. Or, if you think the situation is radically different, there should be new punishable crimes introduced.
Otherwise it’s like punishing everyone caught driving while drunk for non-premeditated murder. One is not the other.
Hey so, at least in the US, drawings can absolutely be considered CSAM
Well, US laws are all bullshit anyway, so makes sense
Normally yeah, but why would you want to draw sexual pictures of children?
Suppose I’m a teenager attracted to people my age. Or suppose I’m medically a pedophile, which is not a crime, and then I would need that.
In any case, for legal and moral purposes “why would you want” should be answered only with “not your concern, go eat shit and die”.
I feel like you didn’t read my comment thoroughly enough. I said it can constitute CSAM. There is a surprising amount of leeway for teenagers, of course.
But no, I’m not gonna let you get away that easily. I want to know why you think it’s morally okay for an adult to draw sexually explicit images of children. Please, tell me how that’s okay?
Because morally it’s not your fucking concern what others are doing in supposed privacy of their personal spaces.
It seems to be a very obvious thing your nose doesn’t belong there and you shouldn’t stick it there.
I don’t need any getting away from you, you’re nothing.
No. That’s not a good enough excuse to potentially be abusing children.
I can’t think of a single good reason to draw those kinds of things. Like at all. Please, give me a single good reason.
It’s good enough for the person whose opinion counts; yours doesn’t. And there’s no such potential.
Too bad.
To reinforce that your opinion doesn’t count is in itself a good reason. The best of them all really.
Okay, so you have no reason. Which is because having sexually explicit images of children, drawn or otherwise, is gross and weird and disturbing. And the fact that you are continually doubling down shows me that you likely need your hard drives and notebooks checked.
Please don’t respond again unless you are telling me what country you are from so I can report you to the appropriate authorities.
People don’t need reasons to do things gross or disturbing or whatever for you in their own space.
Thankfully that’s not your concern, and would get you in jail if you tried to do that yourself. Also I’m too lazy for my porn habits to be secret enough, LOL.
I don’t think you understand. You’re the fiend here. The kind of obnoxious shit that thinks it’s in their right to watch after others’ morality.
I wonder, what if I’d try to report you and someone would follow through (unlikely, of course, without anything specific to report), hypothetically, which instances of stalking and privacy violations they’d find?
You really seem the kind.
Historically, the respectability of a woman depended on her sexuality. In many conservative cultures and communities, that is still true. Spreading the message that deepfakes are some particular horrible form of harassment reinforces that view.
If having your head on the model of a nude model is a terrible crime, then what does that say about the nude model? What does it say about women who simply happen to develop a larger bosom or lips? What does it say about sex before marriage?
The implicit message here is simply harmful to girls and women.
That doesn’t mean that we should tolerate harassment. But it needs to be understood that we can do no more to stop this kind of harassment than we can do to stop any other kind.
This is just apologia for the sexual commodification and exploitation of girls and women. There is literally no girl being sexually liberated here; she has literally had the choice taken from her. Sexual liberation does NOT mean “boys and men can turn all women into personal masturbation aids”. This ENFORCES patriarchy and the subjugation of women. It literally teaches girls that their bodies do not belong to them, that it’s totally understandable for boys to strip them of humanity itself and turn them into sex dolls.
The most deepfaked women are certainly actresses or musicians; attractive people that appear on screens and are known by much of the population.
In some countries, they do not allow people to appear on-screen exactly because of that. Or at least, that’s one justification. If the honor or humanity of a woman depends on sexual feelings that she might or might not arouse in men, then women cannot be free. And men probably can’t be free either.
At no point have I claimed that anyone is being liberated here. I do not know what will happen. I’m just pointing out how your message is harmful.
Spoken like someone who hasn’t been around women.
You mean like a nerd who reads too much?
Furthermore, we generally assume malicious intent, but I wouldn't be surprised if teenagers were using the app to 'get' big boobs etc., we all have seen those shopped pictures with deformed background 😁
I’m not even going to begin describing all the ways that what you just said is fucked up.
I’ll just point out that online deepfake technology is FAR more accessible to the average 13 year old to use on their peers than “porno mags” were in our day.
You want to compare taking your 13 year old classmates photo off of Facebook, running it through an AI and in five seconds creating photo-realistic adult content featuring them, and compare that to getting your dad’s skin-mag from under his mattress when he’s not home, cutting your classmates face out of a yearbook, taping it on, then sneaking THAT into the computer lab at school so that you can photocopy it and pass it around in home room, and then putting the skin-mag BACK under the mattress before your dad finds out.
Is that right…is THAT what you’re trying to say? Are those the two things that you’re trying say are equivalent?
Yes, we all know it’s fucked up. The point is that we don’t need a new class of laws just because it’s harassment and bullying ✨with AI✨.
So is this a way to take away rights by making it about kids?
I mean what the fuck. We did much less and got punished right? It didn’t matter if we were on the property. Schools can hold students accountable for conduct with other students.
The leaded-gas adults of the time had no problem dealing with the emergence of cell phones. It was a distraction. They didn’t need lawmakers to call it something specific. My Pokemon cards caused fights and were banned. No lawmakers needed.
The problem is surely with the interaction between parents and schools. Or maybe it’s just the old way of thinking. Maybe it’s better to have police and courts start taking over discipline in schools.
How is a school going to regulate what kids do outside of school property? They could ban cell phones on campus but that’s not going to change what happens after hours.
Schools can already do that though. You can get in trouble for bullying outside of school, and when I was a student athlete I had pretty strict restrictions on what I was allowed to do because I was an “ambassador” for the school.
And you think these are positive things?
Overall, I would say so yeah.
For the bullying thing, not everyone’s parents are available or willing to discipline their kids.
And for the athletics thing, personally I believe that athletics is more about developing young adults into good people rather than the sport itself. And my school had a bunch of other things like grade minimums, required volunteer hours, we would wear dress shirts and ties before meets, and some other things like that.
All your examples are of things that were stopped while at school, so your argument doesn’t really carry over. You still had your pokemon cards everywhere else.
If kids want to be protected they need to get some better lobbyists. /s
I’m fairly well versed in tech and homelabbing. I’ve never heard of tools that do this, generate images, etc. Not good ones, anyhow. I could use that type of generation for business marketing to develop business cards, marketing materials. NOT FOR PEOPLE GENERATION. Anyone have a list of the best tools? GPT sucks at doing this, I’ve tried.
Take a look at InvokeAI.
Thanks. Rather than everyone downvoting for no real reason, finally someone was at least trying to be helpful. I will use it for super simple business card proofs or basic brochures, that’s about it. Magnets, etc.
No problem. If that doesn’t work for you, ComfyUI is also a popular option, but it’s more complicated.
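If you’d rather poke at it from a script before installing a full UI, here’s a minimal sketch using the Hugging Face diffusers library, which drives the same kind of local text-to-image pipeline those front-ends wrap. The checkpoint ID, prompt, and the assumption of a CUDA-capable GPU are illustrative placeholders, not recommendations:

```python
# Minimal local text-to-image sketch with Hugging Face diffusers.
# Assumes: pip install torch diffusers transformers accelerate, and a CUDA GPU.
# The checkpoint ID and prompt below are illustrative placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",  # any local Stable Diffusion checkpoint works
    torch_dtype=torch.float16,
).to("cuda")

# Generate a simple marketing-style graphic (no people), e.g. a brochure background.
prompt = "flat minimalist vector-style background for a small-business brochure, blue and white"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("brochure_background.png")
```

InvokeAI and ComfyUI mostly add a UI, model management and editing tools (inpainting, upscaling) on top of this kind of pipeline, so for card and brochure proofs either should do.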
I’m sure the laws will focus on protecting IP - specifically that of AI companies or megacorps, the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.
The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court. If you can even afford a lawyer to do so. Then be offered a judgement that probably won’t be paid or won’t cover the damage done by an image that will never be able to be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.
My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!
Thanks, cap’n.
this advice might get you locked up
My mama also told me that if someone locks you up, then you just lock them up right back.
In the bible, it says, and I quote: “If a deepfake of you is made, you shall give the creator more material to create deepfakes”
An eye for an eye, a tooth for a tooth, and a deepfake for a deepfake.
Instead of laws keeping up, it might also turn out to be a case where culture keeps up.
Aren’t there already laws against making child porn?
I’d rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.
Alas, whether there’s a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-apply laws for this specific case does not seem that useful.
There is also a difference between somebody harassing somebody with nude pictures (either real or not) and somebody jerking off to them at home. It does become a problem when an adult masturbates to pictures of children, but children to children? Let’s be honest, they will do it anyway.
I don’t understand fully how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.
not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like “photograph”+“person”+“small”+“pose” and generate plausible material due to the fact that all of those concepts have features in common.
you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to “steer” the output of a model towards a particular style.
you can make even a fully legal model output illegal data.
all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 is medical in nature so there could very well be bad shit in there. it’s like 12 billion images so it’s hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the data.
This is mostly about swapping faces. You take a video and a photo of someone’s face. Software can replace the face of someone in the video with that face. That’s been around for a decade or so. There are other ways of doing it.
When the face belongs to an underage individual, and the video is pornographic…
LLMs only do text.
I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.
Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.
Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.
Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.
Cheers for the explanation, had no idea that’s how it works.
So it’s even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake then has to have access to CP if they want to deepfake it!
You can probably do it with adult material and replace those faces. It will most likely work with models specifically trained on the person you selected.
People have also put dots on people’s clothing to trick the brain into thinking they are naked; you can probably fill those dots in with the correct body parts if you have a good enough model.
There are adults with bodies that resemble underage people that could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to use illegal material to train to get illegal output.
AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.
You know how, when you look at a picture of someone and cover up the clothed bits, they look naked? Your brain fills in the gaps with what it knows of general human anatomy.
It’s like that.
That’s just on its face stupid. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia. It’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.
It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.
I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf cause I produced child pornography. God, some states have stupid laws.
As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.
Yes, absolutely. But with recognition that a thirteen year old kid isn’t a predator but a horny little kid. I’ll let others determine what that punishment is, but I don’t believe it’s prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we’re ratcheting up the punishment, but still not adult prison.
written apology? they’ll just use chatgpt for that
I did say equitable punishment. Equivalent. Whatever.
A written apology is a cop-out for the damage this behaviour leaves behind.
Something tells me you don’t have teenage daughters.
No kids. That’s why I say others should write the punishments. A written apology wasn’t meant as the only punishment. It was in addition to community service and other stipulations.
In a properly functioning world, this could easily be coupled with particular education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.
Of course, so long as we’re in this hypothetical world, you’d just have that kind of education be a part of sex ed or the like for all students to begin with, but, as we’re in this world and that’s Louisiana…
There is a difference between ruining the life of a 13 year old boy for the rest of his life, with no recourse and no expectations,
vs. scaring the shit out of them and making them work their ass off doing an ass load of community service for a summer.
And what about the life of the girl this boy would have ruined?
This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).
I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.
It is not abnormal to see different punishment for people under the age of 18. Good education about sex and what sexual assault does to its victims (same with guns, drugs including alcohol, etc).
You can still course correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it etc.
The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.
Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.
Yeah, I agree, we shouldn’t ruin the boy’s life, we should ruin his whole family to many times the extent something like this ruins a teen girl’s life.
You’re a fucking asshole. This isn’t like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible. A thirteen year old kid is anything but responsible (I mean their mentality / maturity, I’m not giving them a pass).
Go hang out with conservatives who want more policing. Over here, we’ll talk about social programs you fucking prick.
I am an asshole, that’s never been in question, and I fully own it. Having said that, no amount of “social programs” is going to have any effect if fucking parents don’t raise their kids right.
I’m entirely against surveillance, except when it comes to parents and keeping a close eye on everything their kids watch, browse or otherwise access (evidently making it known to the kids that “I can see EVERYTHING you see and do”).
So, yeah, hang the imbecile parents that should not have had kids in the first place if they thought a fucking social program or school would raise them instead. Fuck off.
And thanks to the assholes in Congress who just passed the Big Betrayal Bill, those are all going away.
Teenagers are old enough to understand consequences.
In fact, my neighborhood nearly burned down last week because a teenager, despite being told “no” and “stop” multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.
<img alt="" src="https://lemmy.dbzer0.com/pictrs/image/a8f53ebc-bec7-4c2d-a651-8ac9ae22696c.webp">
Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.
some day I hope to be brave enough to post pictures of my house on the internet
Fake pictures do not ruin your life… sorry…
Our puritanical / 100% sex culture is the problem, not fake pictures…
Punishment for an adult man doing this: Prison
Punishment for a 13 year old for doing this: Publish his browsing and search history in the school newsletter.
13 year old: “I’ll just take the death penalty, thanks.”
In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?
it existed if society liked you enough.
fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.
I can already picture that as an Onion headline:
New York Renames State to ‘WokeVille’. NYC to follow.
Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they're here encouraging the next generation of misogynist scum by defending this shit, too.
And men (pretend to) wonder why we distrust them.
Ngl, I'm only leaving reply notifs on for this one to work on my blocklist.
Yeah there’s some nasty shit here. Big yikes, Lemmy.
Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.
It would of course be their choice to wear them, but I’d definitely look for ways to limit their time in areas with cameras present.
That’s just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe under the surveillance we live with today, it’s best to act as though you are being recorded in your own home as well.
You can make areas safe from cameras. No, you can’t make everywhere camera-free, but you can minimize your time in those areas. I’m not saying it’s a good system, it would just be adjusting to the times.
If the floor was lava and all that…
If you don’t know, don’t try? Seems a bit defeatist.
There’s also the matter of “you” the NPC and well… “You”.
You can rest easy knowing Trump knows you’re at work, but not the contents of the monologue you gave on Palestine on a political XMPP chatroom.
That’s what muslims do with niqabs.
Don’t trivialize the scramble suit, ok
mkay 😬
God I’m glad I’m not a kid now. I never would have survived.
In my case, other kids would not have survived trying to pull off shit like this. So yeah, I’m also glad I’m not a kid anymore.
probably because there's a rapist in the white house.
To add to that. I live in a red area and since the election I’ve been cat called much more. And it’s weird too, cus I’m middle aged…. I thought I’d finally disappear…
the toxic manosphere/blogosphere/whatever it's called has done so much lifelong damage
At least they’ve learned a skill?
Maybe let’s assume all digital images are fake and go back to painting. Wait… what if children start painting deepfakes ?
Or pasting someone’s photo over porn…in their minds…
anyone using any kind of AI either doesn't know how consent works, or they don't care about it.
a horrifying development in the intersection of technofascism and rape culture
Any AI? Every application? What kind of statement is this?
AI models (unless you’re training your own) are usually trained on data they do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.
Like in crypto, most people in AI are not nerds, just criminal scum.
You are thinking of LLMs, not AI in general.
I am. And so is OC. Neural networks are a different beast, although neither is actual AI. Just a marketing term at this point.
Deepfakes might end up being the modern version of a bikini. In the olden days, people wore these to the beach. Having less was scandalous and moral decay. Yet, now we wear much less.
<img alt="" src="https://lemmy.today/pictrs/image/d914882e-ae07-4adc-b3ed-f8018ebed066.jpeg">
<img alt="" src="https://lemmy.today/pictrs/image/9f167b4e-f86b-45d0-b201-1590949c2b2f.jpeg">
<img alt="" src="https://lemmy.today/pictrs/image/986b4cbe-87b8-48f0-b0f0-1ad2832addc2.jpeg">
<img alt="" src="https://lemmy.today/pictrs/image/7ce40414-38f4-4449-8f84-3bde5b0f58ac.jpeg">
Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.
These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls, it’s like if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards it’s about sexual abuse.
It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against people.
In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.
Unless it is used to pretend that it is a real video and circulated for denigration or blackmail, it is very much not at all like assault. And also, deepfakes do not have the special features hidden under your clothes, so it is possible to debunk those if you really have to.
Back in my day we just had to use our own imagination.
Can’t afford this much cheese today to find just the right slice for every bikini photo…
Burkas for the win?