autotldr@lemmings.world
on 24 Apr 2024 17:45
This is the best summary I could come up with:
Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.
Since early last year, at least two dozen states have introduced bills to combat A.I.-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization.
nudification apps is enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.
A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude A.I.
Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.
After learning of the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker.
The original article contains 1,288 words, the summary contains 198 words. Saved 85%. I’m a bot and I’m open source!
otp@sh.itjust.works
on 24 Apr 2024 18:11
The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.
If we allow people to use this tech for adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.
* we don’t need to get into semantics. I’m just saying it’s not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.
Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.
The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them “legitimately” would want to use it…or to just ban them outright.
vzq@lemmy.blahaj.zone
on 24 Apr 2024 18:14
That’s a lot of words to defend fake child porn made out of photos and videos of actual children.
NOT_RICK@lemmy.world
on 24 Apr 2024 18:23
Reading comprehension not a strong suit? Sounds to me they’re arguing for protections for both adults AND minors.
Ok_imagination@lemmy.world
on 24 Apr 2024 19:29
Words is treacherous bastards
NOT_RICK@lemmy.world
on 24 Apr 2024 19:35
I don’t always be like that but sometimes it do
Zorque@kbin.social
on 24 Apr 2024 18:30
That's about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid holier-than-thou statement based on purported moral superiority.
Fosheze@lemmy.world
on 24 Apr 2024 19:02
Have you tried actually reading what they said instead of just making shit up?
otp@sh.itjust.works
on 24 Apr 2024 22:26
That’s a lot of words to defend fake child porn made out of photos and videos of actual children.
Uh…this is the second sentence or so (and the start of the second paragraph, I think)
If we allow people to use this tech for adults (which we really shouldn’t)
So I’m not sure where you got the idea that I’m defending AI-generated child porn.
Unless you’re so adamant about AI porn generators existing that banning their usage on adults (or invading the privacy of the users and victims with oversight) is outright unthinkable? Lol
I’m saying that IF the technology exists, people will be using it on pictures of children. We need to keep that in mind when we think about laws for this stuff. It’s not just adults uploading pictures of themselves (perfectly fine) or adult celebrities (not fine, but probably more common than any truly acceptable usage).
micka190@lemmy.world
on 24 Apr 2024 21:13
such as when the person making them is also a minor
I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.
Zorque@kbin.social
on 24 Apr 2024 22:15
Which is more of a "zero-tolerance" policy, like unto giving the same punishment to a student defending themselves as the one given to the person who initiated the attack.
otp@sh.itjust.works
on 24 Apr 2024 22:19
I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.
I agree, and on the one hand, I understand why it could be good to consider it illegal (to prevent child porn from existing), but it does also seem silly to treat it as a case of pedophilia.
atrielienz@lemmy.world
on 27 Apr 2024 15:37
Not just silly. Extremely damaging. We don’t even treat most other crimes minors commit this way. Records can often be expunged for other crimes. At the age of 18 they are generally sealed. But not in this case.
This is the government doing a bad job of regulating technology it does not fully understand, in an attempt to save the children by punishing them, sometimes for life, over what essentially amounts to heavy flirting between people of their own age group.
Child porn is not okay and it should be illegal. But the law cannot always be applied in a way that is equal because a kid sending another kid a nude of themselves is not the same as an adult using the nude of a child for sexual gratification or excitement. One of those things is a natural normal thing. The other is extremely reprehensible and damaging to the victims used to create those images.
We have sex offender registries for serious crimes, where people can’t live close to schools, need to be monitored, etc. Examples of such crimes include…
Rape
Sexual assault
Urinating outside
Sending a solicited nude to a classmate
BrianTheeBiscuiteer@lemmy.world
on 24 Apr 2024 22:54
And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves and they’ll be registered sex offenders for life and probably ineligible for most professions. Seems like quite a gross overreaction. There needs to be a lot of reform in this area but no politician wants to look like a “friend” to pedophiles.
gravitas_deficiency@sh.itjust.works
on 25 Apr 2024 11:21
It does seem a bit heavy handed when the context is just two high schoolers tryna smash.
NightAuthor@lemmy.world
on 25 Apr 2024 12:43
They’re both pedos and should be locked up for life.
micka190@lemmy.world
on 25 Apr 2024 20:21
The issue is that the picture then exists, and it’s hard to prove it was actually destroyed.
For example, when I was in high school, a bunch of girls would send nudes to guys. But that was 10 years ago. Those pictures still exist. Those dudes aren’t minors anymore. Their Messenger chats probably still exist somewhere. Nothing’s really preventing them from looking at those pictures again.
I get why it’s illegal. And, honestly, I find it kind of weird that there’s people trying to justify why it shouldn’t be illegal. You’re still allowed to have sex at that age. Just don’t take pictures/videos of it.
BrianTheeBiscuiteer@lemmy.world
on 03 May 2024 13:10
That makes complete sense except that stuff just does not register with teens. If a couple months in juvenile hall and 100 hours community service isn’t enough deterrent for a teenager then 5 years in jail and a lifelong label of “sex offender” won’t deter them. I recall seeing a picture of a classmate topless (under 18) and over 20 years later it finally dawned on me that it was child pornography.
If we prosecuted every offender to the full extent of the law then like half of every high school class would be in jail. Not to say that something should be legal as long as enough people are breaking the law but if millions of kids are violating some of the strictest laws in the country we’re probably not getting the full picture.
WhyDoYouPersist@lemmy.world
on 24 Apr 2024 18:12
For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.
Imgonnatrythis@sh.itjust.works
on 24 Apr 2024 22:12
Won’t somebody please think of Taylor?!
GreyEyedGhost@lemmy.ca
on 25 Apr 2024 07:55
But not that way…
coarse@startrek.website
on 25 Apr 2024 16:24
Rich families sometimes also have teen girls among them.
themeatbridge@lemmy.world
on 24 Apr 2024 18:39
No reason not to ban them entirely.
The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.
Problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.
Right, this is my point. The toothpaste is out of the tube. So would simply having the software capable of making deepfake porn be a crime?
Perhaps an unpopular opinion, but I’d be fine with that. I have yet to see a benefit or possible benefit that outweighs the costs.
The problem is the cat’s out of the bag.
Open source image generators already exist and have been widely disseminated worldwide.
So all you'd end up doing is putting up a roadblock for legitimate uses. Anybody using it to cause harm will not be seriously impeded. They can just pick up the software from a Russian/Chinese/EU host or less official distribution methods.
It would be as effective as the US trying to outlaw the exporting of strong encryption standards in the 90s. That is to say, completely ineffective and actually harmful. Enemies of the US were still using strong encryption anyway.
mynamesnotrick@lemmy.zip
on 24 Apr 2024 21:33
I feel like a sensible, realistic course of action with this is that it needs to be about the act of sharing/distributing. It would be way too broad otherwise, as the tools that generate this stuff have unlimited purposes. Obvious child situations should be dealt with at the point of production, but the enforcement mechanism needs to be on the sharing/distribution part. Unfortunately the analogy here is blame the person, not the tool.
Right. And honestly, this should already be covered under existing harassment laws.
themeatbridge@lemmy.world
on 25 Apr 2024 00:02
Yeah, I feel like if you find this shit on someone’s computer, whether they shared it or not, there should be some consequences. Court-mandated counseling at a minimum.
peanuts4life@lemmy.blahaj.zone
on 24 Apr 2024 22:23
Good!
Archive link - archive.ph/SVJtX
Probably a good thing, but just banning something doesn’t do anything - you have to enforce it by getting rid of the software & then keep enforcing it
Buelldozer@lemmy.today
on 24 Apr 2024 22:46
I don’t know why you got downvoted, you’re right. This is going to be ridiculously hard to enforce.
coarse@startrek.website
on 25 Apr 2024 16:26
I downvoted him because it’s not a good thing and going further to ban the software is even worse.
This is a waste of resources that could be better spent helping humanity. Instead, we use it to protect affluent white girls from their own insecurity.
ristoril_zip@lemmy.zip
on 25 Apr 2024 00:10
This genie is probably impossible to get back in the bottle.
People are going to just direct the imitative so-called AI program to make the face just different enough to have plausible deniability that it’s a fake of this person or that person. Or use existing tech to age them to 18+ (or 30+ or whatever). Or darken or lighten their skin or change their eye or hair color. Or add tattoos or piercings or scars…
I’m not saying we should be happy about it, but it is here and I don’t think it’s going anywhere. Like, if you tell your so-called AI to give you a completely fictional nude image or animation of someone that looks similar to Taylor Swift but isn’t Taylor Swift, what’s the privacy (or other) violation, exactly?
Does Taylor Swift own every likeness that looks somewhat like hers?
PM_Your_Nudes_Please@lemmy.world
on 25 Apr 2024 04:33
It’s also not a new thing. It’s just suddenly much easier for the layman to do. Previously, you needed some really good photoshop skills to pull it off. But you could make fake nudes if you really wanted to, and were willing to put in the time and effort.
FlyingSquid@lemmy.world
on 25 Apr 2024 12:24
This does give prosecutors a new angle though. So it’s not for nothing.
coarse@startrek.website
on 25 Apr 2024 16:22
Yay, more things to waste money on.
TORFdot0@lemmy.world
on 25 Apr 2024 12:40
If the prompt includes “Taylor Swift” or an image of her, then it doesn’t matter if the AI slightly changed it; it used her likeness to generate the image, and so she should have rights to the image and the ability to claim damages.
The same thing should apply to using deepfake porn AIs to make non-consensual nudes of a private person, or heck, manually creating non-consensual deepfake nudes should also fall under the same definition.
Saik0Shinigami@lemmy.saik0.com
on 25 Apr 2024 13:33
This is not how it works. Paparazzi that take her image own the rights to the image. Not Taylor Swift. They make the money on the image when they sell it and Taylor Swift gets nothing out of the sale and has no rights on that transaction. If you’re in public you can be photographed. If a photographer takes an image and releases it to the public domain, the subjects of the image will have no say in it unless the photographer broke some other law. (E.g. peeping Tom laws or stalking.)
jacksilver@lemmy.world
on 26 Apr 2024 16:35
I believe that your statements are only true for public figures. I’m pretty sure non-public figures retain the right to photos of themselves (unless they aren’t the main subject in the photograph).
Stackexchange conversation about this.
Saik0Shinigami@lemmy.saik0.com
on 26 Apr 2024 17:27
Negative. Go take headshots at a photo place. You don’t have the right to make copies of your own headshot without permission from that photo place. Your own headshot would literally be you as the primary subject. Yet you still don’t have rights to it unless your contract with that photographer says otherwise.
In your own link the first answer even states it…
The photographer is normally the sole owner of the copyright in the photograph.
Subjects having any rights to the photo is rare, short of other laws being broken.
Edit: Hell, my own kids’ school pictures. I have to purchase a digital copy of the photo to get a release from the company to make my own prints. EVEN ON MY OWN PRINTER.
jacksilver@lemmy.world
on 26 Apr 2024 17:59
Sorry my bad, I was speaking to pictures taken in a public setting, but didn’t clarify. When you get headshots done you are giving the photographer the rights.
Saik0Shinigami@lemmy.saik0.com
on 26 Apr 2024 18:10
Still negative.
legalbeagle.com/8581945-illegal-pictures-people-p…
Generally, you can take any photos you want of people when they are in a public location, like a park, a beach or a city square. It’s perfectly legal since they have elected to place themselves in a public location and have no reasonable expectation of privacy. If you snap a hundred pictures of people at a political rally, a marathon or a rock concert in the park, all is well and good.
[…]
For example, if you photograph a couple kissing on the beach and publish the photo in the newspaper, they cannot complain. They have no claim against you even if one of the two happens to be married to someone else and the marriage ends because of the photo.
So even though that couple is the direct foreground subject of the image, the photographer is NOT liable, not only for taking the picture and publishing it, but ALSO for any damages the picture caused by being published. This is why the paparazzi are also protected.
In the previous post the photographer has the rights because it’s their photo, not because you’re giving them any rights.
Edit: Typo
jacksilver@lemmy.world
on 26 Apr 2024 19:53
Taking photos and the right for commercial use of the photos are two different things. The reason why film crews/photographers generally ask for people to sign releases is because it’s not clear cut. While the US is generally more forgiving, it’s not a guarantee.
Saik0Shinigami@lemmy.saik0.com
on 26 Apr 2024 20:52
Right… So back to the topic discussion rather than adding extra shit… Someone taking pictures and putting it through AI… There’s no problem. They own the rights to that photo and all derivative works (except for any cases where it outright violates a law, peeping tom/stalking/etc…). Public figure or not.
After that it can get gray (but I never brought up sale or commercial AI use as a thing… Not sure why people assume I did). But it’s quite rare for a sold picture to cause a photographer problems. Even if the subjects didn’t necessarily consent.
Some other countries might have problems with that and have different laws on the books. But at this point in the world it’s really not hard to have a shell company in a territory/country that doesn’t have such laws… Then it no longer matters again. Good luck finding the photographer to sue.
TheFriar@lemm.ee
on 25 Apr 2024 00:27
That title is…misleading. Why start it that way?
FenrirIII@lemmy.world
on 25 Apr 2024 01:15
It does sound off. But, then again, these are politicians, so it could go either way.
sugar_in_your_tea@sh.itjust.works
on 25 Apr 2024 03:47
collapse
Son_of_dad@lemmy.world
on 25 Apr 2024 05:28
Hey we said no deepfakes!
hexdream@lemmy.world
on 25 Apr 2024 01:01
And no chance it’s because they want to, uh, thoroughly investigate the evidence…
NutWrench@lemmy.world
on 25 Apr 2024 12:07
“We’re gonna ban Internet stuff” is something said by people who have no idea how the Internet works.
pro_grammer@programming.dev
on 25 Apr 2024 12:32
They probably do this to satisfy voters who also don’t know how the internet works.
Daft_ish@lemmy.world
on 25 Apr 2024 12:24
This is probably not the best context but I find it crazy how fast the government will get involved if it involves lude content but children are getting murdered in school shootings and gun control is just a bridge too far.
pro_grammer@programming.dev
on 25 Apr 2024 12:31
I think they act faster on those matters because, aside from it being a very serious problem, it also fits their conservative agenda.
It’s very easy to say: “LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!”
But they can’t just go and say “let’s enforce gun safety in schools”, because having conservative voters read “gun safety” would already go badly for them.
They know they are sacrificing the well-being of children by not acting on the school shootings, but for them it’s just the price of a few lives to stay in power.
coarse@startrek.website
on 25 Apr 2024 16:21
That’s because it’s a problem that affects rich people.
Psythik@lemmy.world
on 26 Apr 2024 15:43
Are quaaludes even still available in 2024?
Or did you mean to say “lewd”?
Daft_ish@lemmy.world
on 26 Apr 2024 16:06
If only their were context clues… oh wait you’re just being a jerk.
NerdyApex@lemmynsfw.com
on 26 Apr 2024 16:15
Did you mean “there were”?
Daft_ish@lemmy.world
on 26 Apr 2024 16:17
Sorry can’t help it; I’m an energy vampire and we tend to be jerks. Got it from my dad.
EatATaco@lemm.ee
on 26 Apr 2024 16:51
There are no competing interests when it comes to protecting children from child sexual exploitation. When it comes to protecting them from guns, there is the competing interest of the Second Amendment.
Grandwolf319@sh.itjust.works
on 26 Apr 2024 20:40
Those shootings don’t happen in private schools.
Nudes happen in private schools.
coarse@startrek.website
on 25 Apr 2024 16:21
Spurred by rich people*
We don’t solve problems that only affect poor people.
boatsnhos931@lemmy.world
on 27 Apr 2024 19:22
My digital bobs and vagene is very special I’ll have you know