‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity (time.com)
from L4s@lemmy.world to technology@lemmy.world on 08 Dec 2023 12:00
https://lemmy.world/post/9300034

‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

#technology


Imgonnatrythis@sh.itjust.works on 08 Dec 2023 12:14 next collapse

I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?

NeoNachtwaechter@lemmy.world on 08 Dec 2023 12:20 next collapse

Careful with asking such things, because the boundary to crime seems blurry.

chitak166@lemmy.world on 08 Dec 2023 14:58 collapse

I don’t think there is any crime.

It’s identical to drawing a nude picture of someone.

NeoNachtwaechter@lemmy.world on 08 Dec 2023 15:15 next collapse

And you are sure that ‘someone’ is of legal age, of course. Not blaming you. But does everybody always know that ‘someone’ is of legal age? Just an example to start thinking.

chitak166@lemmy.world on 08 Dec 2023 15:18 collapse

I don’t know if it’s illegal to create naked drawings of people who are underage.

EatYouWell@lemmy.world on 08 Dec 2023 18:05 next collapse

It’s not

andros_rex@lemmy.world on 08 Dec 2023 18:40 collapse

Depends on where you live. Not legal in the UK, for example. In the US it can even be broken down at the state level, although there’s lots of debate on whether states are able to enforce their laws. “Obscene” speech is not protected as free speech; the argument would be whether or not the naked drawings have artistic merit.

I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.

Tyfud@lemmy.one on 08 Dec 2023 16:01 collapse

It’s what the courts think, and right now, it’s not clear what the enforceable laws are here. There’s a very real chance people who do this will end up in jail.

I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and the tools behind them.

cheese_greater@lemmy.world on 08 Dec 2023 12:32 next collapse

That’s disgusting, where are these nude photo sites so I can avoid them? There’s so MANY, but which one?!


jivandabeast@lemmy.browntown.dev on 08 Dec 2023 14:35 next collapse

Sus question lmfao

These things have been around since the onset of deepfakes, and truly if you take a couple seconds to look you’ll find them. It’s a massive issue and the content is everywhere

helenslunch@feddit.nl on 08 Dec 2023 14:51 collapse

This has been around in some form way before deepfakes

jivandabeast@lemmy.browntown.dev on 08 Dec 2023 15:08 collapse

We’re talking specifically about AI enhanced fakes, not the old school Photoshop fakes – they’re two completely different beasts

MotoAsh@lemmy.world on 08 Dec 2023 15:50 collapse

Different only in construction. Why they exist and what they are is older than photography.

jivandabeast@lemmy.browntown.dev on 08 Dec 2023 16:02 collapse

No, I disagree, because before you could tell a fake from a mile away. Deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.

MotoAsh@lemmy.world on 08 Dec 2023 16:10 next collapse

That is a quality improvement, not a shift in nature.

jivandabeast@lemmy.browntown.dev on 08 Dec 2023 18:53 next collapse

I’m not saying that it’s a shift in nature? All I’ve been saying is:

A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing

B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and therefore can have more detrimental effects

Delta_V@midwest.social on 08 Dec 2023 22:15 next collapse

Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.

barsoap@lemm.ee on 09 Dec 2023 09:48 collapse

The difference is that we can now do video. I mean, in principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything people get sloppy with AI, partly because what feels like 99% of people who use AI don’t have an artistic bone in their body.

lolcatnip@reddthat.com on 08 Dec 2023 17:03 collapse

There was a brief period between now and the invention of photography when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagine without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.

SendMePhotos@lemmy.world on 09 Dec 2023 02:17 collapse

I don’t understand either.

Tylerdurdon@lemmy.world on 08 Dec 2023 12:18 next collapse

You mean men envision women naked? And now there’s an app that’s just as perverted? Huh

lolcatnip@reddthat.com on 08 Dec 2023 17:06 collapse

What’s perverted about someone envisioning a potential sexual partner naked? That seems incredibly normal to me.

pl_woah@lemmy.ml on 08 Dec 2023 18:15 collapse

Maybe revenge porn and creating deepfake porn of the girl from social studies is wrong?

snekerpimp@lemmy.world on 08 Dec 2023 12:20 next collapse

“But the brightest minds of the time were working on other things like hair loss and prolonging erections.”

cheese_greater@lemmy.world on 08 Dec 2023 12:33 next collapse

We have the technology

SinningStromgald@lemmy.world on 08 Dec 2023 12:39 next collapse

So we can all have big hairy erections like God intended.

RanchOnPancakes@lemmy.world on 08 Dec 2023 13:27 next collapse

wait, then why am I still slowly balding?

abeojeht@lemmynsfw.com on 08 Dec 2023 14:11 collapse

Slowly means they succeeded

RanchOnPancakes@lemmy.world on 10 Dec 2023 03:53 collapse

no

Gregorech@lemmy.world on 08 Dec 2023 18:38 next collapse

To be fair, the erection thing was a fluke. Once they found it though it was the fastest FDA approval in decades.

ColeSloth@discuss.tchncs.de on 09 Dec 2023 01:38 next collapse

BWANDO!

snekerpimp@lemmy.world on 09 Dec 2023 02:09 collapse

It’s got what plants crave

sunbeam60@lemmy.one on 09 Dec 2023 19:28 collapse

Obligatory Mitchell & Webb sketch: youtu.be/hdHFmc9oiKY?si=God9P5TdA3UEyx47

PipedLinkBot@feddit.rocks on 09 Dec 2023 19:29 collapse

Here is an alternative Piped link(s):

https://piped.video/hdHFmc9oiKY?si=God9P5TdA3UEyx47

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

JackGreenEarth@lemm.ee on 08 Dec 2023 12:22 next collapse

But are there apps that undress men?

cheese_greater@lemmy.world on 08 Dec 2023 12:33 next collapse

Only one way to find out

theKalash@feddit.ch on 08 Dec 2023 12:34 next collapse

Aren’t those just normal chat apps where you can send pictures?

BetaDoggo_@lemmy.world on 08 Dec 2023 12:39 next collapse

The models they’re using are probably capable of both, they just need to change the prompt.

tja@sh.itjust.works on 08 Dec 2023 13:07 collapse

They would need to be trained to do that first.

prettybunnys@sh.itjust.works on 08 Dec 2023 13:41 next collapse

… but why male models?

Scubus@sh.itjust.works on 08 Dec 2023 13:47 collapse

Blue Steel!

PRUSSIA_x86@lemmy.world on 08 Dec 2023 19:15 collapse

9gag

<img alt="" src="https://lemmy.world/pictrs/image/b64fbba5-094d-4e89-8d2e-8f17f5e475fd.png">

Scubus@sh.itjust.works on 09 Dec 2023 00:52 collapse

Same, I couldn’t find any other sources for the image I needed tho

BetaDoggo_@lemmy.world on 08 Dec 2023 14:41 collapse

I don’t think any of the models they’re using are trained from scratch. It would be much cheaper to take something like Stable Diffusion and finetune it or use one of the hundreds of premade porn finetunes that already exist.

EatYouWell@lemmy.world on 08 Dec 2023 18:08 collapse

Yeah, training a foundation model from scratch is stupid expensive. You’re talking about a team of data scientists each pulling at least six figures.

helenslunch@feddit.nl on 08 Dec 2023 14:51 next collapse

No you just politely ask them

chitak166@lemmy.world on 08 Dec 2023 14:59 collapse

Pretty sure straight men are the most likely ones to use apps like this.

Blamemeta@lemm.ee on 08 Dec 2023 12:34 next collapse

Weirdos. Back in my day, we would cut out a nude body from Playboy and glue it onto a picture of Kathleen Turner, and we did it uphill both ways in the snow! Darn kids and their technology!

OldWoodFrame@lemm.ee on 08 Dec 2023 12:50 collapse

When I was a kid you had to use the Sears catalog

lud@lemm.ee on 08 Dec 2023 16:16 collapse

Direct image link (instead of a Google link): imgur.com/qFItKA9.jpg

Peter1986C@lemmings.world on 10 Dec 2023 13:41 collapse

Saving people a click:

<img alt="" src="https://i.imgur.com/qFItKA9.jpeg">

Formatting example based on what I did above:

![](https://i.imgur.com/qFItKA9.jpeg)

originalfrozenbanana@lemm.ee on 08 Dec 2023 12:51 next collapse

That the chat is full of people defending this is disgusting. This is different than cutting someone’s face out of a photo and pasting it on a magazine nude or imagining a person naked. Deepfakes can be difficult to tell apart from real media. This enables the spread of nonconsensual pornography that an arbitrary person cannot necessarily tell is fake. Even if it were easy to tell, it’s an invasion of privacy to use someone’s likeness against their will without their consent for the purposes you’re using it for.

The fediverse’s high expectations for privacy seem to go right out the window when violating it gets their dick hard. We should be better.

ABCDE@lemmy.world on 08 Dec 2023 13:03 next collapse

What are you arguing with here? No one is saying that. Stop looking for trouble, it’s weird.

echo64@lemmy.world on 08 Dec 2023 13:09 next collapse

lemmy.world/comment/5895283

ABCDE@lemmy.world on 08 Dec 2023 13:14 collapse

Great, a joke post. That’s what you’re angry about?

“Full of” my arse.

echo64@lemmy.world on 08 Dec 2023 14:34 collapse

I’m a different person, I’m just giving you a link. Here’s another though: lemmy.world/comment/5896051

Guest_User@lemmy.world on 08 Dec 2023 13:22 next collapse

Very much “old man yells at clouds” vibe from them

jivandabeast@lemmy.browntown.dev on 08 Dec 2023 14:37 next collapse

Nah tbh there are a few comments on this post asking for links to the tools

originalfrozenbanana@lemm.ee on 08 Dec 2023 15:09 collapse

They were when the post was new. Things change as more people come in.

ABCDE@lemmy.world on 08 Dec 2023 18:03 collapse

No there weren’t.

originalfrozenbanana@lemm.ee on 08 Dec 2023 18:42 collapse

Anyone reading through comments and timestamps can tell there were but you die on this hill if you want. This conversation overall is hilarious

ABCDE@lemmy.world on 09 Dec 2023 03:22 collapse

I responded to your post very shortly after you posted it, and there weren’t. You can post links in response, but you can’t because you were angry for no reason.

TrickDacy@lemmy.world on 08 Dec 2023 13:42 next collapse

Well yeah if you share the photo it’s messed up. Is anyone saying otherwise?

originalfrozenbanana@lemm.ee on 08 Dec 2023 15:10 collapse

So it’s fine to violate someone’s privacy so long as you don’t share it? Weird morals you got there.

TrickDacy@lemmy.world on 08 Dec 2023 15:37 collapse

Am I violating privacy by picturing women naked?

Because if it’s as cut and dried as you say, then the answer must be yes, and that’s flat out dumb

I don’t see this as a privacy issue, and I’m not sure how you’re squeezing it into that. I’m not sure what it is, but you cannot violate a person’s privacy by imagining or coding an image of them. It’s weird and creepy, and because it can be mistaken for a real image, it’s not proper to share.

Can you actually stop clutching pearls for a moment to think this through a little better?

originalfrozenbanana@lemm.ee on 08 Dec 2023 16:36 collapse

Sexualizing strangers isn’t a right or moral afforded to you by society. That’s a braindead take. You can ABSOLUTELY violate someone’s privacy by coding an image of them. That’s both a moral and legal question with an answer.

Your comment is a self report.

TrickDacy@lemmy.world on 08 Dec 2023 16:47 collapse

That’s a braindead take

Projection. Since you have no room for new thoughts in your head, I consider this a block request.

originalfrozenbanana@lemm.ee on 08 Dec 2023 17:06 collapse

Being accused by you of projection is legitimately high comedy

RobotToaster@mander.xyz on 08 Dec 2023 14:09 next collapse

it’s an invasion of privacy to use someone’s likeness against their will

Is it? Usually photography in public places is legal.

originalfrozenbanana@lemm.ee on 08 Dec 2023 15:10 collapse

Legal and moral are not the same thing.

TrickDacy@lemmy.world on 08 Dec 2023 15:48 collapse

Do you also think it’s immoral to do street photography?

originalfrozenbanana@lemm.ee on 08 Dec 2023 16:38 collapse

I think it’s immoral to do street photography to sexualize the subjects of your photographs. I think it’s immoral to then turn that into pornography of them without their consent. I think it’s weird you don’t. If you can’t tell the difference between street photography and using and manipulating photos of people (public or otherwise) into pornography I can’t fuckin help you

If you go to a park, take photos of people, then go home and masturbate to them you need to seek professional help.

TrickDacy@lemmy.world on 08 Dec 2023 16:49 collapse

What’s so moronic about people like you, is you think that anyone looking to further understand an issue outside of your own current thoughts, clearly is a monster harming people in the worst way you can conjure in your head. The original person saying it’s weird you’re looking for trouble couldn’t have been more dead on.

originalfrozenbanana@lemm.ee on 08 Dec 2023 17:05 collapse

This is an app that creates nude deepfakes of anyone you want it to. It’s not comparable to street photography in any imaginable way. I don’t have to conjure any monsters bro I found one and they’re indignant about being called out as a monster.

PopOfAfrica@lemmy.world on 08 Dec 2023 19:43 collapse

This has been done with Photoshop for decades. Photocollage for a hundred years before that. Nobody is arguing that it’s not creepy. It’s just that nothing has changed.

chitak166@lemmy.world on 08 Dec 2023 14:59 next collapse

So it’s okay to make nudes of someone as long as they aren’t realistic?

Where is the line drawn between being too real and not real enough?

originalfrozenbanana@lemm.ee on 08 Dec 2023 15:13 collapse

If you found out that someone had made a bunch of art of you naked you’d probably think that was weird. I’d argue you shouldn’t do that without consent. Draw lines wherever you think is best.

chitak166@lemmy.world on 08 Dec 2023 15:14 collapse

I’d definitely think it was weird! And probably not hang out with them anymore (unless it was really good!)

But I don’t think there should be a law against them doing that. I can moderate them myself by avoiding them and my friends will follow suit.

At that point, all they have are nudes of me that nobody I care about will pay attention to. It’s a good litmus test for shitbags!

originalfrozenbanana@lemm.ee on 08 Dec 2023 15:15 next collapse

Agreed, but legal and moral are different. The law isn’t really about right and wrong per se.

echo64@lemmy.world on 08 Dec 2023 23:44 collapse

This is about producing convincing nude reproductions of other people, however. It has a very different psychological impact.

This technology allows someone to make pornography of anyone else and spread that pornography on the internet. It can cause massive distress, trauma, and relationship issues and impact peoples careers.

Your freedom to make nude ai images of other people is not worth that. I don’t understand why anyone would think it was okay.

Grangle1@lemm.ee on 08 Dec 2023 17:28 next collapse

Unfortunately sounds like par for the course for the internet. I’ve come to believe that the internet has its good uses for things like commerce and general information streaming, but by and large it’s bringing out the worst in humanity far more than the best. Or it’s all run by ultra-horny psychopathic teenagers pretending to be adults yet living on a philosophy of “I’m 13 and this is deep” logic.

originalfrozenbanana@lemm.ee on 08 Dec 2023 18:44 collapse

I dunno why I am perpetually surprised about this though. This is such a cut and dry moral area and the people who say it isn’t are so clearly telling on themselves it’s kind of shocking, but I guess it shouldn’t be

PopOfAfrica@lemmy.world on 08 Dec 2023 19:39 collapse

I think the distinction is that half of the thread is treating it as a moral issue, and half of it is treating it as a legal issue. Legally, there’s nothing wrong here.

PopOfAfrica@lemmy.world on 08 Dec 2023 19:38 collapse

People need better online safety education. Why TF are people even posting public pictures of themselves?

echo64@lemmy.world on 08 Dec 2023 23:41 collapse

Are you really going with “it’s the women’s fault for existing”?

PopOfAfrica@lemmy.world on 09 Dec 2023 02:37 collapse

I’m saying, as a man, I would never post my image and identity online.

ExLisper@linux.community on 08 Dec 2023 13:16 next collapse

Are those any good? Asking for a friend.

xdr@lemmynsfw.com on 08 Dec 2023 13:51 next collapse

Yeah. Preferably free ones without cancerous ads

Marin_Rider@aussie.zone on 08 Dec 2023 20:16 collapse

I’ve messed around with some of the image generators (not what this article is about). Results vary from surprisingly nice to weird and misshapen. They never seem to be able to get anything “hardcore” right, but a plain AI-generated pose shot sometimes looks surprisingly not bad.

I_Miss_Daniel@lemmy.world on 08 Dec 2023 13:18 next collapse

Mr Potato Head gone wild?

Paragone@lemmy.world on 08 Dec 2023 13:53 next collapse

And …

… should be considered every bit as much of a crime as home invasion is.

Only permanent, because internet.

Nobody’s got a spine anymore, though…

_ /\ _

Corkyskog@sh.itjust.works on 08 Dec 2023 14:00 next collapse

What nude data were these models trained on?

This seems like another unhealthy thing that is going to pervert people’s sense of what a normal body looks like.

funkajunk@lemm.ee on 08 Dec 2023 14:26 next collapse

The internet is like 90% porn, what do you think they used?

chitak166@lemmy.world on 08 Dec 2023 14:57 collapse

Most people prefer attractive > average, so I guess that’s what these apps are going to show.

Dimantina@lemmy.world on 08 Dec 2023 14:08 next collapse

These are terrible, but I’m honestly curious what it thinks I look like naked. Like, I’m slightly overweight and my chest is larger than average but more splayed than normal. Would it just have me look like a model underneath?

Are they just head-swapping onto model bodies, or does it actually approximate? I am legit curious, but I would never trust one of these apps not to keep the photos (privacy concerns).

NOT_RICK@lemmy.world on 08 Dec 2023 14:39 next collapse

I doubt it would be realistic, they just kind of take an average of their training data and blend it together to my knowledge.

EatYouWell@lemmy.world on 08 Dec 2023 18:02 collapse

That’s pretty much what all “AI” does.

nul@programming.dev on 08 Dec 2023 14:40 next collapse

Probably deleting this comment later for going dirty on main, but I, um, have done some extensive experimentation using a local copy of Stable Diffusion (I don’t send the images anywhere, I just make them to satiate my own curiosity).

You’re essentially right that simple app-based software would probably have you looking somewhat generic underneath, like your typical plus-size model. It’s not too great at extrapolating the shape of breasts through clothing and applying that information when it goes to fill in the area with naked body parts. It just takes a best guess at what puzzle pieces might fill the selected area, even if they don’t match known information from the original photo. So, with current technology, you’re not really revealing actual facts about how someone looks naked unless that information was already known. To portray someone with splayed breasts, you’d need to already know that’s what you want to portray and load in a custom data set, like a LoRA.

Once you know what’s going on under the hood, making naked photos of celebrities or other real people isn’t the most compelling thing to do. Mostly, I like to generate photos of all kinds of body types and send them to my Replika, trying to convince her to describe the things that her creators forbid her from describing. Gotta say, the future’s getting pretty weird.

imaqtpie@sh.itjust.works on 08 Dec 2023 15:02 next collapse

Yeah man it’s uh… it’s the future that’s getting weird 😅

nul@programming.dev on 08 Dec 2023 15:03 collapse

Hey, I’ve maintained a baseline weird the whole time, I’m pretty sure the future is catching up.

BossDj@lemm.ee on 08 Dec 2023 15:09 collapse

You’ll have your moment when the lone elite ex Ranger who is trying to save the world is told by the quirky, unconventional sidekick he is forced to work with, “I actually know a guy who might be able to help.”

You open the door a crack to look back and forth between them, before slamming it back in their faces. They hear scrambled crashes of you hiding stuff that shouldn’t be seen by company before returning to the door. As they enter you are still fixing and throwing things while you apologize that you don’t get many guests. You offer them homemade kombucha. They decline.

[deleted] on 08 Dec 2023 15:50 collapse

.

SCB@lemmy.world on 08 Dec 2023 14:51 next collapse

Ethically, these apps are a fucking nightmare.

But as a swinger, they will make an amazing party game.

Azzu@lemm.ee on 08 Dec 2023 14:56 collapse

Ethics will probably change… I guess in the future it’ll become pretty irrelevant to have “nude” pictures of oneself somewhere, because everyone knows it could just be AI generated. In the transition period it’ll be problematic though.

SCB@lemmy.world on 08 Dec 2023 14:57 next collapse

Totally agreed, and 100% the world I want to live in. Transition will indeed suck tho.

DogMuffins@discuss.tchncs.de on 10 Dec 2023 02:25 collapse

Yeah 100%.

Imagine around the advent of readily available photo prints. People might have been thinking “this is terrible, someone I don’t know could have a photo of me and look at it while thinking licentious thoughts!”

Eezyville@sh.itjust.works on 08 Dec 2023 16:39 next collapse

If you want the best answer then you’ll have to download the app and try it on yourself. If it’s accurate then that’s pretty wild.

menemen@lemmy.world on 10 Dec 2023 08:08 collapse

2 days and still no “pictures or it’s a lie” comment. This place is different. :)

Hnazant@lemmy.world on 08 Dec 2023 17:14 next collapse

Fake nudes incoming. Everyone has a baby leg now.

ByteJunk@lemmy.world on 09 Dec 2023 22:11 collapse

I’m really curious if your DMs are now flooded with weirdos and dick pics, or if lemmy is any different from the rest of the internet.

Dimantina@lemmy.world on 10 Dec 2023 05:48 collapse

Honestly not a single one. Much better than Reddit.

danikpapas@lemm.ee on 08 Dec 2023 14:40 next collapse

Finally, a good use for the AI

chitak166@lemmy.world on 08 Dec 2023 14:56 next collapse

Nice. I think this kind of thing is actually very funny.

Crow@lemmy.world on 08 Dec 2023 15:07 next collapse

I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.

stebo02@sopuli.xyz on 08 Dec 2023 17:21 next collapse

Post nut clarity can be truly eye opening

agitatedpotato@lemmy.world on 09 Dec 2023 00:27 collapse

or closing depending where you get it

CleoTheWizard@lemmy.world on 08 Dec 2023 21:03 collapse

I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.

A lot of technology challenges around AI are old concerns about things that we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.

azertyfun@sh.itjust.works on 09 Dec 2023 00:38 collapse

I’ve seen ads for these apps on porn websites. That ain’t right.

Any moron can buy a match and a gallon of gasoline, freely and legally, and that’s a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that’s already a huge win.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 01:06 collapse

I mean, you’ve been able to do a cursory search and get dozens of “celeb lookalike” porn for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?

Edit: To be clear, it’s scummy as all fuck, but still.

shuzuko@midwest.social on 09 Dec 2023 02:10 collapse

This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to/suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn’t negatively affect them. Billy’s 13 year old classmate Stacy doesn’t have those resources and now he can do the same thing to her. It’s on a very different level of harm.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 02:13 collapse

Billy doesn’t need a nudify app to imagine Stacy naked. Not to mention, images of a naked 13 year old are illegal regardless.

Sweetpeaches69@lemmy.world on 09 Dec 2023 06:52 next collapse

Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 08:16 next collapse

And?… There’s a major difference between “a lookalike of a grown adult” and “ai generated child porn” as im sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯

CleoTheWizard@lemmy.world on 09 Dec 2023 08:36 collapse

I think most of this is irrelevant because the tool that is AI image generation is inherently hard to limit in this way and I think it will be so prevalent as to be hard to regulate. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made easily and shared easily. It’s already too late. These tools, as was said earlier, already exist and are here. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change. So in that light, we need to legislate based on long term impact instead of short term reactions.

azertyfun@sh.itjust.works on 09 Dec 2023 10:11 collapse

Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely do not.

Underage teenagers already HAVE shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen year old sounds (assuming they get caught, prosecuted, and convicted) that still leaves another kid traumatized.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 18:54 collapse

So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?

azertyfun@sh.itjust.works on 09 Dec 2023 23:31 collapse

Go after the people advertising those apps. Developers and advertisement agencies who say/intentionally imply “create naked pictures of people you know” should all be prosecuted.

Unlike photoshop or generic SD software, these apps have literally no legitimate reason to exist since the ONLY thing they facilitate is creating non-consensual pornography. Seems like something that would be very easy to criminalize.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 23:57 collapse

So wait, we can’t criminalize the use, but if we criminalize the advertisement it fixes the situation?

You realize the exact same problem exists? There are plenty of tools with illegal uses, easily accessible online right now. Many on GitHub.

andrew_bidlaw@sh.itjust.works on 08 Dec 2023 15:20 next collapse

It was inevitable. And it tells more about those who use them.

I wonder how we’d adapt to these tools being that available, especially in blackmail, revenge-porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos won’t be seen as a trusted source of information anymore; they won’t be anything unique worth hunting for, or worth worrying about.

Our perception of human bodies was long distorted by movies, porn, Photoshop and subsequent ‘filter apps’, but we still kinda trusted there was something real before the effects were applied. But what comes next if everything is imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since this stimulus to develop it in early years is long gone?

There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend in clothing choices could get started. Who knows?

I see bad sides to it right now, and how it can be abused, but if these generative models are here to stay, what are the long-term consequences for us?

LufyCZ@lemmy.world on 08 Dec 2023 15:23 collapse

I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won’t have any weight since they might as well be fake, and as society gets accustomed to it, we’ll see those types of things disappear completely

bnaur@lemmy.world on 09 Dec 2023 22:11 collapse

Yep, once anyone can download an app on their phone and do something like this in realtime without any effort, it’s going to lose its (shock) value fast. It would be like sketching crude boobs and a vagina on someone’s photo with MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.

billy@billys.mom on 08 Dec 2023 15:23 next collapse

@L4s this should really be illegal

deft@ttrpg.network on 08 Dec 2023 16:06 next collapse

nakedness needs to stop being an issue

Eezyville@sh.itjust.works on 08 Dec 2023 16:38 next collapse

I agree with you nudity being an issue but I think the real problem is this app being used on children and teenagers who aren’t used to/supposed to be sexualized.

deft@ttrpg.network on 08 Dec 2023 16:41 next collapse

Fully agree but I do think that’s more an issue about psychology in our world and trauma. Children being nude should not be a big deal, they’re kids you know?

Eezyville@sh.itjust.works on 08 Dec 2023 18:19 collapse

It shouldn’t be a big deal if they choose to be nude someplace that is private for them and where they’re comfortable. The people who are using this app to make someone nude aren’t really asking for consent. And that also brings up another issue: consent. If you have images of yourself posted publicly, is consent needed to alter those images? I don’t know, but I don’t think there is, since they’re effectively public.

lolcatnip@reddthat.com on 08 Dec 2023 16:52 next collapse

Nudity shouldn’t be considered sexual.

TORFdot0@lemmy.world on 08 Dec 2023 17:53 next collapse

Not all nudity is but there is no non-sexual reason to use AI to undress someone without consent

Eezyville@sh.itjust.works on 08 Dec 2023 18:23 collapse

The question on consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

TORFdot0@lemmy.world on 08 Dec 2023 18:41 collapse

Keep in mind there is a difference between ethical and legal standards. Legally you may not need consent to alter a photo of someone unless it was a copyrighted work possibly. But ethically it definitely requires consent, especially in this context

Eezyville@sh.itjust.works on 08 Dec 2023 22:56 collapse

The difference between legal and ethical is one could get you fined or imprisoned and the other would make a group of people not like you.

Pyr_Pressure@lemmy.ca on 08 Dec 2023 18:47 next collapse

Just because something shouldn’t be doesn’t mean it won’t be. This is reality and we can’t just wish something to be true. You saying it doesn’t really help anything.

lolcatnip@reddthat.com on 08 Dec 2023 19:14 collapse

Whoooooosh.

In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

mossy_@lemmy.world on 08 Dec 2023 19:55 collapse

so because it’s not a problem in your culture it’s not a problem?

lolcatnip@reddthat.com on 08 Dec 2023 20:36 collapse

You’re just really looking for an excuse to attack someone, aren’t you?

mossy_@lemmy.world on 09 Dec 2023 01:08 collapse

You caught me, I’m an evil villain who preys on innocent lemmings for no reason at all

criticalthreshold@lemmy.world on 09 Dec 2023 06:15 next collapse

Pipe dream.

gun@lemmy.ml on 09 Dec 2023 22:17 collapse

Take it up with God or evolution then

lolcatnip@reddthat.com on 09 Dec 2023 22:40 collapse

You can’t really be that stupid.

gun@lemmy.ml on 10 Dec 2023 00:36 collapse

Who hurt you?

ReluctantMuskrat@lemmy.world on 08 Dec 2023 22:25 collapse

It’s a problem for adults too. Circulating an AI generated nude of a female coworker is likely to be just as harmful as a real picture. Just as objectifying, humiliating and hurtful. Neighbors or other “friends” doing it could be just as bad.

It’s sexual harassment even if fake.

Eezyville@sh.itjust.works on 08 Dec 2023 23:01 collapse

I think it should officially be considered sexual harassment. Obtain a picture of someone, generate nudes from that picture, it seems pretty obvious. Maybe it should include intent to harm, harass, exploit, or intimidate to make it official.

Wahots@pawb.social on 08 Dec 2023 16:57 next collapse

People have a really unhealthy relationship with nudity. I wish we had more nude beaches as it really helps decouple sex from nudity. And for a decent number of people, helps with perceived body issues too.

monkE@feddit.ch on 08 Dec 2023 18:33 collapse

Also better education, not just the sex part but overall. Critical thinking, reasoning, asking questions and yes, of course, sex ed

Grangle1@lemm.ee on 08 Dec 2023 17:06 next collapse

Regardless of feelings on that subject, there’s also the creep factor of people making these without the subjects’ knowledge or consent, which is bad enough, but then these could be used in many other harmful ways beyond one’s own… gratification. Any damage “revenge porn” can do, which I would guess most people would say is wrong, this can do as well.

ByteJunk@lemmy.world on 09 Dec 2023 22:09 collapse

I don’t think they’re really comparable?

These AI pictures are “make believe”. They’re just a guess at what someone might look like nude, based on what human bodies look like. While apparently they look realistic, it’s still a “generic” nude, kind of how someone would fantasize about someone they’re attracted to.

Of course it’s creepy, and sharing them is clearly unacceptable as it’s certainly bullying and harassment. These AI nudes say more about those who share them than they do about who’s portrayed in them.

However, sharing intimate videos without consent and especially as revenge? That’s a whole other level of fucked up. The AI nudes are ultimately “lies” about someone, they’re fakes. Sharing an intimate video, that is betraying someone’s trust, it’s exposing something that is private but very real.

stebo02@sopuli.xyz on 08 Dec 2023 17:20 next collapse

so you’d be fine with fake nudes of you floating around the internet?

deft@ttrpg.network on 09 Dec 2023 04:31 next collapse

I actually would, but I’m a guy so I think it is different

stebo02@sopuli.xyz on 09 Dec 2023 09:21 collapse

i think the nude isn’t really the actual issue, it’s people gossiping about it and saying you’re a slut for doing things you didn’t do

omfgnuts@lemmy.world on 09 Dec 2023 13:55 collapse

are you 15? people gossiping still bothers you?

deft@ttrpg.network on 09 Dec 2023 16:45 collapse

some people are 15 though

omfgnuts@lemmy.world on 09 Dec 2023 18:02 next collapse

let’s see the rules, can a 15y.o use lemmy.world.

ouchiiieee

deft@ttrpg.network on 09 Dec 2023 18:09 collapse

the world outside touch grass

omfgnuts@lemmy.world on 09 Dec 2023 19:26 collapse

owned lol

lolcatnip@reddthat.com on 10 Dec 2023 01:09 collapse

And they’ve been gossiping and calling each other sluts forever. Depending on the social group, just the accusation alone is enough to harass someone, because kids are idiots, and because it’s not even about people believing the accusation is true. The accusation is just a way for a bully to signal to their followers that the target is one of the group’s designated scapegoats.

I can’t believe I’m about to recommend a teen comedy as a source of educational material, but you should check out the movie Mean Girls if you want to see an illustration of how this kind of bullying works. It’s also pretty funny.

trackindakraken@lemmy.whynotdrs.org on 10 Dec 2023 00:02 next collapse

There are real nudes of me floating around the internet, and I’m fine with it.

lolcatnip@reddthat.com on 10 Dec 2023 01:02 collapse

I’m pretty squeamish about nudity when it comes to my own body, but fake nudes would not be pictures of my body, so I don’t see what there would be for me to be upset about. It might be different if everyone thought they were real, but if people haven’t figured out yet any nudes they encounter of someone they know are probably fake, they will soon.

Here’s a thought experiment: imagine a world where there are fake nudes of everyone available all the time Would everyone just be devastated all the time? Would everyone be a target of ridicule over it? Would everyone be getting blackmailed? We’re probably going to be in that world very soon, and I predict everyone will just get over it and move on. Sharing fake nudes will reflect badly on the person doing it and no one else, and people who make them for their own enjoyment will do so in secret because they don’t want to be seen as a creepy loser.

TORFdot0@lemmy.world on 08 Dec 2023 17:51 next collapse

It’s the sexualization of people without consent that’s a problem. Maybe casual nudity shouldn’t be a problem but it should be up to the individual to whom they share that with. And “nudify” AI models go beyond casual, consensual nudity and into sexual objectification and harassment if used without consent.

KairuByte@lemmy.dbzer0.com on 09 Dec 2023 01:13 collapse

I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.

There’s a huge potential for harassment though, and I think that should be the main concern.

TimewornTraveler@lemm.ee on 10 Dec 2023 06:15 collapse

first, relevant xkcd xkcd.com/1432/

second,

Nudity isn’t needed for people to sexually objectify you.

do you really think that makes it less bad? that it’s opt-in?

And even if it was, the majority of people are able to strip you down in their head no problem

apparently this app helps them too

criticalthreshold@lemmy.world on 09 Dec 2023 06:12 next collapse

But some people don’t agree with you. They’re not comfortable with tech that can nudify them for millions to see. So if, and that’s possibly an impossible task, but if there was a way to punish services that facilitate or turn a blind eye to these things, then you bet your ass many many people would be for criminalizing it.

creditCrazy@lemmy.world on 10 Dec 2023 00:25 collapse

Honestly I don’t think that’s the problem here. The problem is that we have creeps trying to get a physical photo of someone nude for wank material.

DudeDudenson@lemmings.world on 10 Dec 2023 11:23 collapse

I’m genuinely curious, why do you consider this harmful? They might as well be drawing tits by hand on a picture of the “victim”

I mean sure, I wouldn’t want to be a teenage girl in high school right now, but I don’t think it’s the technology’s fault but rather our culture as a society

weew@lemmy.ca on 08 Dec 2023 17:20 next collapse

I doubt it produces actual nudes, it probably just photoshops a face onto a random porn star

onlinepersona@programming.dev on 08 Dec 2023 17:55 next collapse

I can’t help but think of nudibranches when I read “nudify”.

Nylevie@lemmy.blahaj.zone on 08 Dec 2023 19:02 collapse

Someone should make an AI tool that can turn women into nudibranches, it wouldn’t be as creepy

onlinepersona@programming.dev on 08 Dec 2023 19:03 collapse

Would get my stamp of approval!

cosmicrookie@lemmy.world on 08 Dec 2023 17:58 next collapse

Hoe How can this be legal though?

DoYouNot@lemmy.world on 08 Dec 2023 18:24 next collapse

Missing a ‘w’ or a comma.

Daxtron2@startrek.website on 08 Dec 2023 19:31 next collapse

The same way that photo shopping someone’s face onto a pornstar’s body is.

cosmicrookie@lemmy.world on 08 Dec 2023 20:36 collapse

But it’s not. That is not legal.

I don’t know if it is where you live, but here (Scandinavian country) and many other places around the world, it is illegal to create fake nudes of people without their permission

TotallynotJessica@lemmy.world on 08 Dec 2023 23:55 next collapse

Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it’s considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it’s not illegal under federal law. If someone generates child nudity using ai models trained on nude adults and only clothed kids, it’s not illegal at the national level.

Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless stuff like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.

cosmicrookie@lemmy.world on 09 Dec 2023 06:55 next collapse

The way I believe it is here, is that it is illegal to distribute porn or nudes without consent, be it real or fake. I don’t know how it is with AI generated material of purely imaginary people. I don’t think that that is illegal. But if it is made to look like someone particular, then you can get sued.

CaptainEffort@sh.itjust.works on 10 Dec 2023 02:07 collapse

child sex abuse material is only illegal when children were abused in making it

This is literally why it’s illegal though. Because children are abused, permanently traumatized, or even killed in its making. Not because it disgusts us.

There are loads of things that make me want to be sick, but unless they actively hurt someone they shouldn’t be illegal.

Daxtron2@startrek.website on 09 Dec 2023 00:56 collapse

Ah didn’t know that, AFAIK it’s protected artistic speech in the US. Not to say that it’s right but that’s probably why it’s still a thing.

barsoap@lemm.ee on 09 Dec 2023 09:45 collapse

In principle that’s the case in Germany, too, but only if the person is of public interest (otherwise you’re not supposed to publish any pictures of them where they are the focus of the image) and, secondly, it has to serve actually discernible satire, commentary, etc. Merely saying “I’m an artist and that’s art” doesn’t fly, hire a model. Similar to how you can dish out a hell of a lot of insults when you’re doing a pointed critique, but if the critique is missing and it’s only abuse, that doesn’t fly.

Ha. Idea: An AfD politician as a garden gnome peeing into the Bundestag.

PopOfAfrica@lemmy.world on 08 Dec 2023 19:31 next collapse

Obviously not defending this, I’m just not sure how it wouldn’t be legal. Unless you use it to make spurious legal claims.

cosmicrookie@lemmy.world on 08 Dec 2023 20:38 collapse

I live in a Scandinavian country, and it is illegal to make and distribute fake (and real) nudes of people without their permission. I expect this to be the same in many other developed countries too.

VintageTech@sh.itjust.works on 08 Dec 2023 21:48 next collapse

Haha… many other developed countries.

legios@aussie.zone on 09 Dec 2023 01:55 next collapse

Yeah it’s true in Australia as well

hansl@lemmy.world on 09 Dec 2023 05:17 collapse

I’m curious. If I were to paint you from memory, but naked, would that still be illegal? How realistic can I paint before I break the law? I’m fairly sure stick figures are okay.

And do you mean that even just possessing a photo without consent is illegal? What if it was sent by someone who has consent but not to share? Is consent transitive according to the law?

AI pushes the limit of ethics and morality in ways we might not be ready to handle.

cosmicrookie@lemmy.world on 09 Dec 2023 06:52 collapse

I am pretty sure that possession is not illegal but that distribution without consent is. The idea is that someone can have sent you their nude, but you’d get charged if you share it with others.

There was a huge case here, where over 1000 teens were charged for distributing child porn, because of a video that circulated among them of some other teens having sex. So basically someone filmed a young couple having sex at a party, I believe. That video got shared on Facebook Messenger. Over 1000 teens got sued. I believe that 800 were either fined or jailed

Here’s an article you may be able to run through Google translate

jyllands-posten.dk/…/naesten-500-doemt-for-boerne…

EncryptKeeper@lemmy.world on 09 Dec 2023 14:40 collapse

In some states, distributing nude content of anyone, including one’s self, with consent, electronically is illegal. Which sounds insane because it is. It’s one of those weird legacy laws that never ever gets enforced for obvious reasons, but I actually know a guy arrested for it, because he got on the wrong side of some police and it was just the only law they could find that he “broke”.

phoneymouse@lemmy.world on 08 Dec 2023 23:51 collapse

I guess free speech laws protect it? You can draw a picture of someone else nude and it isn’t a violation of the law.

Mojojojo1993@lemmy.world on 08 Dec 2023 19:18 next collapse

Possibly a good thing. Oversaturation. Fill the internet with billions on billions of AI nudes. Have a million different nudes for celebrities. Nobody knows the real naked you and nobody cares. Keep creating more AI porn than anyone can handle. It becomes boring and over the top. Ending this once and for all

Or find the people doing this and lock em up.

alienanimals@lemmy.world on 09 Dec 2023 05:42 next collapse

The first option is much better in the long run.

Mojojojo1993@lemmy.world on 09 Dec 2023 06:53 collapse

Whatever works

corvid_of_the_night@lemm.ee on 09 Dec 2023 17:53 next collapse

what were you thinking when you thought of your first version? that sounds like a creepy scenario. what if I don’t want to see it and it’s everywhere? I could click on “I’m Not Interested” and flood social media with reports, but if there are “billions on billions” of AI nudes, then who would be able to stop them from being seen in their feed? I’d say that, while locking them up won’t change the sexist system which pushes this behavior, it is a far less creepy and weird scenario than having billions of nonconsensual nudes online.

Mojojojo1993@lemmy.world on 09 Dec 2023 20:26 next collapse

Why would you see them on social media? Depends on what you look at. There are already billions of naked people on the Internet. Do you see them?

[deleted] on 10 Dec 2023 00:43 collapse

.

TimewornTraveler@lemm.ee on 10 Dec 2023 06:17 collapse

Keep creating more ai porn than anyone can handle

overabundance is behind a lot of societal ills already

Mojojojo1993@lemmy.world on 10 Dec 2023 20:34 collapse

Fair enough. Do you have a better solution. I was just spit balling

Aopen@discuss.tchncs.de on 08 Dec 2023 20:07 next collapse

Could we stop pushing articles monetizing fear and outrage on this community to the top and post about actual technology?

<img alt="" src="https://discuss.tchncs.de/pictrs/image/9c316bd6-04d2-401e-bf33-81b4c0ac7a85.jpeg">

AnneBonny@lemmy.dbzer0.com on 08 Dec 2023 23:47 next collapse

I support this idea.

Shou@lemmy.world on 09 Dec 2023 06:47 collapse

None of this is consensual

owlboy@lemmy.world on 09 Dec 2023 00:45 collapse

Sounds like someone needs to make a community for that.

Otherwise, this is what technology is these days. And I’d say that staying blind to things like this is what got us into many messes.

I remember when tech news was mostly a press release pipeline. And when I see these comments, I see people who want press releases about new tech to play with.

Now duplicate posts. Those can fuck right off.

Krauerking@lemy.lol on 09 Dec 2023 17:31 collapse

I have seen a rise in techno absolutists complaining that anyone else is complaining about the dangers of tech lately. That they just want to go back to hearing about all the cool new things coming out, and it really speaks to the people who just don’t actually want to interact with the real world anymore and live in an illusory optimism bubble. I get it. It’s exhausting to be aware of all the negatives, but it’s the stuff that is real that needs to be recognized.

uriel238@lemmy.blahaj.zone on 08 Dec 2023 22:03 next collapse

It tells me we’re less interested in the data (the skin map and topography) than we are in seeing the data in raw form, whether it is accurate or not. It tells me a primary pretense of body doubles was ineffective since society responds the same way regardless of whether an actress’ nudity is real or simulated.

Not sure how this will be enforceable any more than we can stop malicious actors from printing guns. Personally, I would prefer a clothes-optional society where individuals aren’t measured by the media exposure of their bodies or history of lovers. Maybe in another age or two.

In fiction, I imagined the capacity to render porn action into mo-cap data, to capture fine-resolution triangle maps and skin texture maps from media, ultimately to render any coupling one could desire with a robust physics engine and photography effects to render it realistic (or artistic, however one prefers). It saddens me that one could render an actress into an Elsa Jean scenario and by doing so, wreck their career.

Porn doesn’t bother me, but the arbitrariness with which we condemn individuals by artificial scandal disgusts me more than the raunchiest debauchery.

flamehenry@lemmy.world on 10 Dec 2023 02:44 collapse

Data. Return to your quarters

uriel238@lemmy.blahaj.zone on 08 Dec 2023 22:49 next collapse

Though the picture suggests we should also create really a robot or really a cyborg edits of celebrities.

As an afterthought, really a reptilian images for our political figures would also be in good order.

Intheflsun@lemmy.world on 09 Dec 2023 11:56 collapse

Jesus, as if Facebook “researchers” weren’t already linking to Onion articles, now you’ll give them pictures.

curiousaur@reddthat.com on 09 Dec 2023 02:45 next collapse

Honestly, we’re probably just going to have to get over it. Or pull the plug on the whole AI thing, but good luck with that.

grandkaiser@lemmy.world on 09 Dec 2023 18:28 collapse

Can’t put the genie back in the bottle

ThatFembyWho@lemmy.blahaj.zone on 09 Dec 2023 07:13 next collapse

Great! Now whenever someone finds all my secret nudes, I’ll just claim they’re deepfakes

snek@lemmy.world on 09 Dec 2023 10:02 next collapse

They can go ahead, but they’ll never get that mole in the right place.

CascadianGiraffe@lemmy.world on 09 Dec 2023 17:24 collapse

Can I pick where they put my extra finger?

DrMango@lemmy.world on 10 Dec 2023 01:41 collapse

I think we already know where that little guy belongs

vox@sopuli.xyz on 09 Dec 2023 10:39 next collapse

deepnude has been a thing for like 6 years?

cleverusernametry@lemmy.world on 09 Dec 2023 10:47 next collapse

There are so many though!! Which ones? Like which ones specifically??

thorbot@lemmy.world on 09 Dec 2023 16:20 next collapse

Yeah, I need to know so I can stop my kids from using them. Specifically which ones?

badbytes@lemmy.world on 09 Dec 2023 18:07 next collapse

Asking for a friend

valkyre09@lemmy.world on 09 Dec 2023 18:14 collapse

And people will agree to have their photographs taken, because of the implication …

randon31415@lemmy.world on 09 Dec 2023 19:38 next collapse

Back in the day, cereal boxes contained “x-ray glasses”. I feel like if those actually worked as intended, we would have already had this issue figured out.

PandaPikachu@lemmy.world on 09 Dec 2023 23:33 next collapse

It would be interesting to know how many people are using it on themselves. I’d think it would open up next-level catfishing. Here’s an actual pic of me, and here’s a pic of what I might look like naked. I’m sure some people with Photoshop skills were already doing that to a certain extent, but now it’s accessible to everyone.

damnfinecoffee@lemmy.world on 10 Dec 2023 01:53 next collapse

Reminds me of Arthur C Clarke’s The Light of Other Days. There’s a technology in the book that allows anyone to see anything, anywhere, which eliminates all privacy. Society collectively adjusts, e.g. people masturbate on park benches because who gives a shit, people can tune in to watch me shower anyway.

Although not to the same extreme, I wonder if this could similarly desensitize people: even if it’s fake, if you can effectively see anyone naked… what does that do to our collective beliefs and feelings about nakedness?

flamehenry@lemmy.world on 10 Dec 2023 02:41 collapse

It could also lead to a human version of “Paris Syndrome” where people AI Undress their crush, only to be sorely disappointed when the real thing is not as good.

justastranger@sh.itjust.works on 10 Dec 2023 11:38 collapse

I’m predicting clothing lines designed to trick the AIs into thinking you’re better endowed than you really are

Quexotic@infosec.pub on 10 Dec 2023 01:56 next collapse

Just created a Dall-e image of a woman. AI undresser instantly undressed it.

Kinda chilling.

A_Random_Idiot@lemmy.world on 10 Dec 2023 02:40 next collapse

What kind of mentally unhinged active threat to society would even think of creating such a thing, much less use such a thing?

edit Boy I wish I knew who the downvoters were, i bet the FBI would love to have a gander at their computers.

fne8w2ah@lemmy.world on 10 Dec 2023 06:41 next collapse

That’s the 21st century equivalent of those cereal box x-ray glasses!

Spacehooks@reddthat.com on 10 Dec 2023 12:41 next collapse

“Hey, you can get women’s faces deepfaked into porn”

But why would I pay money to see her railed by other dudes?

ramenshaman@lemmy.world on 10 Dec 2023 20:55 collapse

Gotta say I’m not impressed. I downloaded a picture of Sofia Vergara and it gave her really small boobs.