NeoNachtwaechter@lemmy.world
on 20 Nov 08:07
researchers concluded that “outlawing all deepfakes is unrealistic and unfeasible”—especially since all the harmful AI-generated images that are already out there are likely to “remain online indefinitely.”
Just think a little bigger:
It must be a crime to have the harmful material.
Have it on your PC or phone —> goto jail.
Have it in your online account —> goto jail.
Be a service provider and have it on your server —> goto jail.
This will reduce the stuff.
yuppp, exactly like in North Korea.
nice !
sensiblepuffin@lemmy.world
on 20 Nov 15:02
You really haven’t thought this through. What happens if I email you a bunch of illegal pictures? Guess we’re both going to jail.
NeoNachtwaechter@lemmy.world
on 20 Nov 16:36
Oh but I don’t need to think too much through it. It is pretty much the legal situation in Germany (and probably several other European countries). There may be some edge cases that I don’t know.
Of course, if you send stuff to me, then you are the first of the evil ones :) and if I can convince the judge that I did not know and did not want (!) the stuff (and by the way, how did they even know about it? Even before I had the chance to delete it?), there will be room for a reasonable decision.
KairuByte@lemmy.dbzer0.com
on 20 Nov 15:10
This will also make it trivial to target someone and have them sent to jail. I could literally post an image, right now, and it would be on your current device. Using your logic, you’d be liable and on your way to jail.
An AI-generated nude photo scandal has shut down a Pennsylvania private school. On Monday, classes were canceled after parents forced leaders to either resign or face a lawsuit potentially seeking criminal penalties and accusing the school of skipping mandatory reporting of the harmful images.
Classes are planned to resume on Tuesday, Lancaster Online reported.
Title is misleading?
So the school is still in operation.
Shut down for one day at least.
webghost0101@sopuli.xyz
on 20 Nov 11:38
I have mixed feelings about this prosecution of ai deepfakes.
Like obviously people should have protection against becoming a victim of such and perpetrators should be held accountable.
But the line “feds are currently testing whether existing laws protecting kids against abuse are enough to shield kids from AI harms” would be an incredibly dangerous precedent because those are mostly designed for actual physical sex crimes.
As wrong as it is to create and distribute ai generated sex imagery involving non consenting people it is not even remotely as bad as actual rape and distributing real photos.
rottingleaf@lemmy.world
on 20 Nov 12:16
Creating and distributing anything should be legal if no real person suffers during its creation and if it’s not intended for defamation, forgery, or such things.
Alphane_Moon@lemmy.world
on 20 Nov 12:48
You would be fine with AI-gen porn images of your teenage daughter being distributed around the internet?
droporain@lemmynsfw.com
on 20 Nov 12:59
Meanwhile, in reality, check out what she is distributing through Snapchat and OnlyFans… Maybe pursue the actual crimes first, then go after fiction if there are spare resources.
Big “but what was she wearing?” energy here.
I don’t give a shit if she’s doing Shein bikini hauls on Youtube. If you use AI to nudify her pictures, you’re manufacturing child pornography, and deserve the full consequences for doing that.
As for OnlyFans, they are quite strict about age requirements. Children aren’t running OF accounts. You just hate women and needed to bring up OF to slut-shame.
rottingleaf@lemmy.world
on 20 Nov 13:43
If you use AI to nudify her pictures, you’re manufacturing child pornography, and deserve the full consequences for doing that.
No, equating this to an actual child being raped is incorrect. These are not crimes of remotely equal magnitude.
Comparing a person who raped a child, made photos and distributed them to a person who used Photoshop or an AI tool is, other than just evil, reducing the meaning of the former.
It is weird how hard you have been defending the production of child pornography in this thread.
rottingleaf@lemmy.world
on 20 Nov 15:54
What this conversation is about has as much to do with child pornography as hentai with loli characters.
You just can’t argue without unsubstantiated accusations, can you?
When real living people are being murdered and abused in droves, you are still worried more about glorified automated Photoshop and accusing its users of being the same as actual rapists.
What this conversation is about has as much to do with child pornography as hentai with loli characters.
Creating sexually explicit images of minors is child pornography.
You just can’t argue without unsubstantiated accusations, can you?
You literally confirmed my claim in your first sentence, and your last.
When real living people are being murdered and abused in droves, you are still worried more about glorified automated Photoshop and accusing its users of being the same as actual rapists.
Production of child pornography is production of child pornography. It does not need to involve rape. Producing child pornography is a separate crime.
Its users are pedophiles because they are producing child pornography. You are defending them.
These are the facts.
Victimless crimes are not crimes. Thus producing any pornography is a crime only when it involves violating someone’s rights.
Its users are pedophiles because they are producing child pornography. You are defending them.
Ah, so you are dumb enough to think it’s bad to defend pedophiles who have not committed a crime against a real person?
Damn right, I am defending pedophiles who are being persecuted for being born with that deviation alone. I am also defending pedophiles who satisfy that via any means not harming real people. I will do both till my last breath.
If your argument is that they are disgusting and you don’t want them in society, then so are you.
Victimless crimes are not crimes. Thus producing any pornography is a crime only when it involves violating someone’s rights.
You mean like when someone takes a photo of a minor, removes their clothing to make a sexually explicit image, and uses that image to harass, bully, and extort?
Ah, so you are dumb enough to think it’s bad to defend pedophiles who have not committed a crime against a real person?
Taking a picture of a minor, making that image sexually explicit, and using it to harass, bully, and extort that minor is not a “crime against a real person”?
Damn right, I am defending pedophiles who are being persecuted for being born with that deviation alone. I am also defending pedophiles who satisfy that via any means not harming real people. I will do both till my last breath.
You should stop “defending” their “right” to child pornography and start advocating for them to get real help with the very serious mental disorder that causes them to want sexual activity with a minor instead.
If you argument is that they are disgusting and you don’t want them in society, then so are you.
My argument is that they should not be given child pornography. Your argument is that they should.
The disgusting people I don’t want in society are people who use child pornography, and those who defend their use of child pornography.
Kindly see yourself out and take the rest with you.
Taking a picture of a minor, making that image sexually explicit, and using it to harass, bully, and extort that minor is not a “crime against a real person”?
Doesn’t matter, that’s not what we are talking about here. You don’t have to use a face of a real child.
Oh, you wanted to pretend it is? Cheating doesn’t work with me.
Your argument is that they should.
No, my argument is what I myself already said.
The disgusting people I don’t want in society are people who use child pornography, and those who defend their use of child pornography.
It’s really not your concern what other people create for themselves. Nobody owes you any shame for being born with a flaw.
It’s really a good thing that people with this particular deviation can get materials satisfying them without harming real people. And if one can generate those materials - then that’s a noble endeavor. For every decent person, that is.
Kindly see yourself out and take the rest with you.
No, you are the one unwanted in civilized society.
BTW, for any normal person any pedophile that doesn’t hurt children is better than you.
The only people who defend child pornography this hard are pedophiles, and I am not going to continue to argue with a pedophile.
I hope you get the help you desperately need before it is too late.
Doesn’t matter, that’s not what we are talking about here. You don’t have to use a face of a real child.
Oh, you wanted to pretend it is? Cheating doesn’t work with me.
Despite your shit attitude, AI nudification is, in fact, what we are talking about. It’s what the OP article is about. Actual children were exploited and harmed.
You have decided to change the subject to “what if the child porn is 100% synthetic?”, which is a different thing than what everyone else has been talking about, but is fucked up just the same.
When confronted with the near universal attitude that CSAM is morally reprehensible, you have decided to lash out in anger and act like nobody knows what you’re talking about.
Don’t worry, we get it. You make fucked up pedo shit with Stable Diffusion on your gaming PC, and you get scared every time you see the very real consequences. You think you can change our minds about it by talking down to us, as if being against child pornography was a remotely controversial take.
And you are downplaying the very real crime against very real children the OP article describes because you are compelled to defend your own disgusting habit.
Seek help, and don’t fucking look at child porn. You’re doing irreparable damage to yourself. There are resources available to you: troubled-desire.com
I’m actually interested in dark-grey-eyed blondes just a bit taller than me, and dark-hazel-eyed brunettes just a bit shorter than me, and none of them have been much younger.
But thanks for confirming that you can’t argue without calling your opponent a pedo.
And even more that you really can’t comprehend that someone would argue hard in defense of someone else.
How can one be such a miserable creature is beyond me.
People can be worried about more than one thing at a time.
Fr, bro is giving off some strong Trumpist vibes.
They are now at the point of calling me a disgusting person who doesn’t belong in civilized society because I am against the production, and use, of child pornography.
Give me a million attempts and I would never have guessed that is the person I would encounter today. haha
sunzu2@thebrainbin.org
on 20 Nov 14:22
If you use AI to nudify her pictures, you're manufacturing child pornography, and deserve the full consequences for doing that.
Somebody in a non-US satellite foreign state can go and do that now from the youtube "bikini hauls" since they're publicly available.
What are you or the feds gonna do about that, chief?
If that is your or her concern, don't post pictures online. Otherwise, you are literally at the mercy of the internet. Privacy 101.
I am sure giving feds extra powers on this won't end like everything else, ie abused against lesser peons.
No I’m just pointing out the obvious fake morality. Big “somebody think of the children” energy here Todd. You just hate common sense and logic and are bringing it up because you need a knee jerk reaction to simulate an emotional response from real humans.
I take it, the word “defamation” is not part of your lexicon.
The issue being discussed does not fall under defamation.
Making forged pics of someone else falls under defamation.
It’s very clearly not rape, sexual abuse, child pornography or non-consensual pornography.
Welcome to the internet.
Even knowing what Internet is, I sometimes try to pretend the other side is arguing in good faith.
I mean, it’s as if someone pushed me and I would try to sue them for cutting my hand off. With that hand present.
I would understand the “this punishment is not enough, we have to do more” sentiment, but instead of “more” they are trying to alias a different action with an existing action with harsher punishment.
Bruh how is creating and distributing a non-consensual nude-ified picture of a young girl not a cause for suffering for the victim? Please, explain that to the class.
Did you just not go to school as a kid? If so, that would explain your absolute ineptitude on this topic. Your opinion is some real “your body, my choice” kind of energy.
rottingleaf@lemmy.world
on 20 Nov 13:39
Read my comment again.
Your opinion is some real “your body, my choice” kind of energy.
My advice to you would be to improve your reading comprehension before judging this way.
In particular, the word “defamation”.
There’s a legitimate discussion to be had about harm reduction here. You’re approaching this topic from an all-or-nothing mindset, but there’s quite a bit of research indicating that’s not really how it works in practice. Specifically, as it relates to child pornography, the argument goes that not allowing artificial material to be created leads to an increase in the production of actual child pornography, which obviously means more real children are being harmed than would be if other forms were not controlled in the same fashion. The same sort of logic could be applied to revenge porn, stolen selfies, or whatever else we’re calling the kind of thing this article is referring to. It may not be an identical scenario, but I still think it would be fair to say that an AI-generated image is not as damaging as a real one.
That is not to say that nothing should be done in these situations. I haven’t decided what I think the right move is given the options in front of us but I think there’s quite a bit more nuance here than your comment would indicate.
It may not be an identical scenario but I still think it would be fair to say that an AI generated image is not as damaging as a real one.
“The deepfakes are often used to extort, harass or bully minors, she says, and are easy to make because of the many sites and apps that will “nudify” an image.”
cbc.ca/…/deepfake-minors-porn-explicit-images-1.7…
I think this is probably a really good point. I have no issue with AI generated images, although obviously if they are used to do an illegal thing such has harassment or defamation, those things are still illegal.
I’m of two minds when it comes to AI nudes of minors. The first is that if someone wants that and no actual person is harmed, I really don’t care. Let me caveat that here: I suspect there are people out there who, if inundated with fake CP, will then be driven to ideation about actual child abuse. And I think there is real harm done to that person and potentially the children if they go on to enact those fantasies. However I think it needs more data before I am willing to draw a firm conclusion.
But the second is that a proliferation of AI CP means it will be very difficult to tell fakes from actual child abuse. And for that reason alone, I think it’s important that any distribution of CP, whether real or just realistic, must be illegal. Because at a minimum it wastes resources that could be used to assist actual children and find their abusers.
So, absent further information, I think whatever a person wants to generate for themselves in private is just fine, but as soon as it starts to be distributed, I think that it must be illegal.
That’s a fairly decent and nuanced take.
essteeyou@lemmy.world
on 20 Nov 17:58
Can you share a full-body shot of yourself please? Don’t worry, you won’t suffer while it gets used to create other content that we’ll distribute to your friends, family, classmates, coworkers, etc.
You first.
EDIT: Also I wasn’t talking about pics of real people.
Oh, so “anything” doesn’t mean what it used to mean?
I said “defamation”. If you are not capable of reading, that’s not my fault.
That’s why I’m ancap, you can’t deal with such chimp crowds without private tanks.
K dude.
ShepherdPie@midwest.social
on 20 Nov 21:03
“Deepfakes” are edited pictures of real people. I’d be more inclined to agree with you on completely AI generated images but not something specifically intended to deceive others into thinking they’re viewing a real person’s image.
Deepfakes are; however, the top-level comment I was answering was not limited to deepfakes. And as my further discussion with its author shows, they too didn’t mean only deepfakes.
Their opinion was that any kind of pornography portraying children, even if it’s not shared with others and not based on pics of real people, should be prosecuted just like making real child pornography.
You know, this thread has once again reinforced me in my opinion that the best system of government is Aspie Reich. Only people with Aspergers should be allowed to make laws and judge and hold public posts. The rest of fucking chimps just don’t have what it takes to override their chimp instincts.
How do you litigate ‘intention’ in this way?
boatswain@infosec.pub
on 21 Nov 02:24
My understanding is that intention is not uncommonly litigated; I believe the question of “intent to deceive” is central to trademark law, for example. That’s also what the “degrees” of murder etc. are about.
Disclaimer: I’m not a lawyer. I do read an awful lot of contracts and talk to lawyers.
This is not a legal text, you little cheat.
This is a sentence in natural language, want me to start asking such questions about everything you write?
If you make a deepfake of someone and share it, then it’s defamation. Taking a picture voluntarily shared and editing it is not a crime.
Blueberrydreamer@lemmynsfw.com
on 20 Nov 12:52
I don’t think you’re on the right track here. There are definitely existing laws in most states regarding ‘revenge porn’, creating sexual media of minors, Photoshop porn, all kinds of things that are very similar to ai generated deep fakes. In some cases ai deepfakes fall under existing laws, but often they don’t. Or, because of how the law is written, they exist in a legal grey area that will be argued in the courts for years.
Nowhere is anyone suggesting that making deepfakes should be prosecuted as rape, that’s just complete nonsense. The question is, where do new laws need to be written, or laws need to be updated to make sure ai porn is treated the same as other forms of illegal use of someone’s likeness to make porn.
In some jurisdictions, public urination can put you on a sex offender registry.
It wouldn’t even matter if you’re trying to be discreet and just have to go but there are no public washrooms around.
Nobody was “abused”; this is out of hand. Suspend the kid or whatever that did it, some kind of school punishment, but jail? And lawsuits over some AI images? Crazy.
hedgehogging_the_bed@lemmy.world
on 20 Nov 15:58
The lawsuit was about the fact the school knew for months about the problem and did nothing to address it. If they plausibly couldn’t know, it wouldn’t have been their fault but this was reported to the admin repeatedly and they did nothing.
Exactly this, and rightly so. The school’s administration has a moral and legal obligation to do what it can for the safety of its students, and allowing this to continue unchecked violates both of those obligations.
surewhynotlem@lemmy.world
on 20 Nov 19:03
Is in-person harassment not abuse anymore?
Arstechnica doesn’t cite its sources? All it has are links to more Arstechnica articles.
The above article says,
…but doesn’t cite any sources. There’s an embedded link, back to Arstechnica. What the fuck?
The article refers to “cops” and “feds”. The overall tone of the writing sounds like a high school student wrote it.
It’s the first link in the article lancasteronline.com/…/article_bfb6066e-a2c7-11ef-…