Twitter Acts Fast on Nonconsensual Nudity If It Thinks It’s a Copyright Violation (www.404media.co)
from 0x0@programming.dev to technology@lemmy.world on 08 Oct 2024 16:18
https://programming.dev/post/20335347

Twitter will remove nonconsensual nude images within hours as long as that media is reported for having violated someone’s copyright. If the same content is reported only as nonconsensual intimate media, Twitter may not remove it even after weeks, and might never remove it at all, according to a pre-print study from researchers at the University of Michigan.

#technology


blackbelt352@lemmy.world on 08 Oct 2024 17:10 next collapse

It sucks that this is the mechanism we have to use for this, but a person’s likeness is their own copyright and posting images of someone without permission could be seen as copyright infringement. Granted, this also opens the door to eliminating almost all images from the internet; imagine going to a tourist destination and having to get permission from everyone who might be in your overdone posed tourist photo.

Edit: Since some of y’all are dense motherfuckers and/or just arguing in bad faith, I’m pointing out how using copyright as the enforcement mechanism opens the door for these already flawed copyright systems to be abused even further. I’m specifically pointing to the Right of Publicity, where your likeness is protected from commercial use unless you give permission. It’s why any show or movie filmed in a public place blurs out anyone on camera who hasn’t signed a release form.

givesomefucks@lemmy.world on 08 Oct 2024 17:15 next collapse

but a person’s likeness is their own copyright and posting images of someone without permission could be seen as copyright infringement

Whut?

timewarp@lemmy.world on 08 Oct 2024 17:25 next collapse

Yeah that just isn’t true. If this was true I could charge every business that has ever stored videos of me.

givesomefucks@lemmy.world on 08 Oct 2024 17:34 next collapse

Make $500k/year just by walking in and out of a Walmart all day!

catloaf@lemm.ee on 08 Oct 2024 17:48 next collapse

It’s true, but in terms of publicity, not mere image capture.

en.wikipedia.org/wiki/Personality_rights

blackbelt352@lemmy.world on 09 Oct 2024 13:24 next collapse

www.law.cornell.edu/wex/publicity

It’s the Right of Publicity. Walmart can record security footage, but they shouldn’t be able to use a recording for commercial purposes unless you explicitly give them permission to use it.

timewarp@lemmy.world on 09 Oct 2024 16:06 collapse

Yeah, they sell those security videos and are using them for AI training, etc.

communism@lemmy.ml on 09 Oct 2024 17:12 collapse

If they were publicising those videos, that sounds illegal to me. If I printed off a copyrighted book for my own personal use, that would be legal. If I started distributing my own reprints of a copyrighted book without permission, the copyright holder could go after me. Businesses can hold copyrighted material without distributing it and not be in breach of the law.

timewarp@lemmy.world on 09 Oct 2024 18:20 collapse

Many of those companies use third parties to store those videos and use them to train AI in products that they sell.

blackbelt352@lemmy.world on 09 Oct 2024 13:22 collapse

The Right of Publicity: www.law.cornell.edu/wex/publicity

hedgehog@ttrpg.network on 08 Oct 2024 20:45 next collapse

This isn’t true or how it works, but there is a law being proposed that would sorta make it so: arstechnica.com/…/senates-no-fakes-act-hopes-to-m…

(In the US), your likeness is protected under state law and case law rather than federal law, and I don’t know of any such law that imposes a responsibility on sites like Twitter to take down violations upon your report in the same way the DMCA does. Rather, they allow you to sue the entity who used your likeness for damages in civil court. That isn’t very useful to Jane when her ex-boyfriend uploads revenge porn of her, or to Kate when a random Twitter account deepfakes her face onto a nude.

However, if a picture you hold the copyright to (like a selfie) is used as an input into an AI, arguably you have partial copyright to the output, since the AI-generated elements are not copyrightable and the output could not have been created without your input. As such, I think it would be reasonable to issue a DMCA takedown request if someone posted a nonconsensual deepfake of you, on the grounds that you have a good-faith belief that you hold copyright to it. However, if you didn’t take the picture used as an input yourself, you don’t have copyright to it, and therefore you don’t have partial copyright to the output either. If it’s a deepfake face swap, then whoever owns the copyright of the original scene image or video would also have partial copyright, and they could also issue a DMCA takedown request.

wizardbeard@lemmy.dbzer0.com on 09 Oct 2024 00:05 next collapse

My guy, you seriously aren’t pretending that clothed people in the background of a photo are the same as pictures of someone naked taken or posted without their consent, right?

Just ignoring the core context?

Come on.

blackbelt352@lemmy.world on 09 Oct 2024 13:12 collapse

I’m not making a comparison between the two, I’m pointing out how handling non-consensual nudes through copyright systems could be abused in other instances. I’m also not saying there shouldn’t be a system for getting non-consensual nudes taken down, we absolutely should have one, but it needs to be a system dedicated to taking down non-consensual images, not a patchwork workaround using copyright.

dnick@sh.itjust.works on 09 Oct 2024 22:43 next collapse

There’s no copyright involved in taking a picture of someone, or having a picture of someone… Your tourist pictures are fine. If you publicize them or try selling them, that might be an issue, but making it inconvenient for people to make money off of photos taken without permission isn’t really concerning to most people.

blackbelt352@lemmy.world on 10 Oct 2024 07:43 collapse

www.copyright.gov/what-is-copyright/

Read up on how exactly copyright works: as soon as you fix a work in a tangible and communicable form, you have a copyright to it. Taking a nude photo of yourself gives you the exclusive copyright to that photo. Taking a tourist photo also gives you copyright to that specific photo, but it doesn’t necessarily supersede an existing copyright if the photo is of something that is already copyrighted.

And depending on the jurisdiction, your tourist photos might not be fine. For example, France has very strict privacy laws and copyright enforcement: the Eiffel Tower itself might be public domain, but its light installation is still under copyright, and any modern building designed by an architect who died within the last 70 years is still protected by copyright. On the privacy front, accidentally photographing other people, even in tourist areas, could actually open you up to a lawsuit, but nobody has actually tried that yet, so it’s up in the air whether it would hold up.

HelloHotel@lemmy.world on 09 Oct 2024 22:47 collapse

So if I’m getting this correct, because Zuckerberg runs ads, you can claim the usage is always commercial and therefore always subject to copyright control. If you want nudity taken down, you must use (and in the process normalize) this easily abusable loophole that contains absolutely no safeguards.

blackbelt352@lemmy.world on 10 Oct 2024 07:12 collapse

Not what I’m saying. I’m saying that using copyright enforcement systems as the workaround for getting non-consensual nudes taken down from a website puts even more burden onto already heavily abused systems. That doesn’t have anything to do with the Zucc running ads, it’s because copyright enforcement systems don’t work very well to begin with and are very easily abused by bad actors. It’s not the right tool for the job, and it would be much better to have something specifically dedicated to getting non-consensual nude images taken down instead of some bubblegum-and-twine hack of a solution through copyright enforcement.

HelloHotel@lemmy.world on 10 Oct 2024 09:18 next collapse

Oh, so you’re saying they need to use copyright enforcement tools regardless of whether it’s a valid takedown according to copyright law, NOT that they’re trying to invent a legal reason.

AceFuzzLord@lemm.ee on 08 Oct 2024 17:43 next collapse

In other words, twatter is probably gonna pull the bullshit where they do business as usual and do nothing until police or any government goes after them.

Doom@ttrpg.network on 08 Oct 2024 17:54 collapse

that’s how all corporations act

sunzu2@thebrainbin.org on 08 Oct 2024 18:59 next collapse

Don't hurt normies too hard bro

match@pawb.social on 10 Oct 2024 08:13 collapse

the structure of corporate law is systematically directed towards this behavior

southsamurai@sh.itjust.works on 09 Oct 2024 00:16 next collapse

Someone needs to cook up a bot that flags every post on Twitter, Facebook, or Reddit with a DMCA takedown.

0x0@programming.dev on 09 Oct 2024 08:11 collapse

It’s not unheard of

Confused_Emus@lemmy.dbzer0.com on 09 Oct 2024 17:10 collapse

Paywalled.

Unlock Your Access to Premier Legal Insights with Law.com. Become a Law.com Digital Reader for free!

Nah I’m good.

serenissi@lemmy.world on 10 Oct 2024 09:02 collapse

Goes away with js disabled. Rest of page works fine.

Confused_Emus@lemmy.dbzer0.com on 10 Oct 2024 15:16 collapse

Even with JavaScript disabled, I’m only seeing the first paragraph of the article.

serenissi@lemmy.world on 11 Oct 2024 20:31 collapse

It opens in full for me. Probably geo-paywalled…

milicent_bystandr@lemm.ee on 09 Oct 2024 16:20 next collapse

Nonconsensual nudity only hurts people.

Copyright violation hurts profits.

Asafum@feddit.nl on 09 Oct 2024 16:32 next collapse

“Can someone with money sue me? No? Oh they can fuck off then.”

ShaggySnacks@lemmy.myserv.one on 09 Oct 2024 16:53 next collapse

Shocked! That the guy who owns Twitter isn’t making this a priority.

Wait, it all makes sense when the owner, Elon Musk, makes a tone-deaf joke about impregnating Taylor Swift.

KingThrillgore@lemmy.ml on 10 Oct 2024 15:21 collapse

This confirms my theory that Elon is such a cheapass. He won’t pay for porn, but he’ll jack it to whatever’s in the S3 folder. Regardless of how it got there.