‘IRL Fakes:’ Where People Pay for AI-Generated Porn of Normal People (www.404media.co)
from Stopthatgirl7@lemmy.world to technology@lemmy.world on 29 Mar 2024 09:05
https://lemmy.world/post/13674028

A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

#technology


JackGreenEarth@lemm.ee on 29 Mar 2024 09:11 next collapse

That’s a ripoff. It costs them at most $0.10 to do simple Stable Diffusion img2img. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.

M500@lemmy.ml on 29 Mar 2024 09:28 next collapse

Wait, this is a tool built into Stable Diffusion?

As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.

SorteKanin@feddit.dk on 29 Mar 2024 09:50 collapse

It’s not like deepfake pornography is “built in”, but Stable Diffusion can take existing images and generate new ones based on them; that’s basically how img2img works. The de facto standard UI makes it pretty simple, even for someone who’s not too tech savvy: github.com/AUTOMATIC1111/stable-diffusion-webui
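For the curious, the same img2img step is also scriptable directly, without the web UI. A minimal sketch using Hugging Face’s diffusers library (my assumption for illustration; the AUTOMATIC1111 UI wraps an equivalent pipeline), with the model ID, file names, and a CUDA GPU all being assumptions:

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Load a base Stable Diffusion checkpoint (model ID is one common example).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# img2img starts from an existing picture and re-draws it toward the prompt.
init_image = Image.open("photo.jpg").convert("RGB").resize((512, 512))

result = pipe(
    prompt="a watercolor painting of a mountain landscape",
    image=init_image,
    strength=0.6,        # how far the output may stray from the source (0 to 1)
    guidance_scale=7.5,  # how strongly to follow the prompt
).images[0]

result.save("output.png")
```

The whole point of the UI is that it hides exactly this boilerplate behind a form.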

M500@lemmy.ml on 29 Mar 2024 10:39 next collapse

Thanks for the link. I’ve been running some LLMs locally, and I have been interested in Stable Diffusion. I’m not sure I have the specs for it at the moment though.

TheRealKuni@lemmy.world on 29 Mar 2024 14:19 next collapse

An iPhone from 2018 can run Stable Diffusion. You can probably run it on your computer. It just might not be very fast.

TheRealKuni@lemmy.world on 30 Mar 2024 03:21 collapse

By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like Civitai. They host an enormous number of models, and many of them work with the site’s built-in generation.

Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.

bassomitron@lemmy.world on 29 Mar 2024 11:17 next collapse

Img2img isn’t always spot-on with what you want it to do, though. I was making extra pictures for my kid’s bedtime books that we made together and it was really hit or miss. I’ve even goofed around with my own pictures to turn myself into various characters and it doesn’t work out like you want much of the time. I can imagine it’s the same when going for porn, where you’d need to do numerous iterations and tweaks over and over to get the right look/facsimile. There are tools/SD plugins like Roop which make transferring faces with img2img easier and more reliable, but even then it’s still not perfect. I haven’t messed around with it in several months, so maybe it’s better and easier now.

BlackPenguins@lemmy.world on 29 Mar 2024 17:23 collapse

It depends on the models you use too. There are specially trained models out there, and all you need to do is give one a prompt like “naked” and it’s scary good at making something realistic in 2 minutes. But yeah, there is a learning curve to setting everything up.

Khrux@ttrpg.network on 29 Mar 2024 09:32 next collapse

I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.

I wish everyone involved in this use of AI a very awful day.

sentient_loom@sh.itjust.works on 29 Mar 2024 11:28 collapse

Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.

brbposting@sh.itjust.works on 29 Mar 2024 15:54 collapse

Hitman hires hitman who hires hitman who hires hitman who hires hitman who tells police - Oct ‘19

sentient_loom@sh.itjust.works on 29 Mar 2024 15:55 collapse

Nested hit man scalpers taking advantage of overpaying client.

OKRainbowKid@feddit.de on 29 Mar 2024 10:10 next collapse

In my experience with SD, getting images that aren’t obviously “wrong” in some way takes multiple iterations with quite some time spent tuning prompts and parameters.

echo64@lemmy.world on 29 Mar 2024 10:34 next collapse

The people being exploited are the ones who are the victims of this, not people who paid for it.

sentient_loom@sh.itjust.works on 29 Mar 2024 11:26 next collapse

There are many victims, including the perpetrators.

sbv@sh.itjust.works on 29 Mar 2024 11:52 next collapse

It seems like there’s a news story every month or two about a kid who kills themselves because videos of them are circulating. Or they’re being blackmailed.

I have a really hard time thinking of the people who spend ten bucks making deep fakes of other people as victims.

sentient_loom@sh.itjust.works on 29 Mar 2024 12:59 collapse

I have a really hard time thinking

Your lack of imagination doesn’t make the plight of non-consensual AI-generated porn artists any less tragic.

[deleted] on 29 Mar 2024 12:20 collapse

.

sentient_loom@sh.itjust.works on 29 Mar 2024 12:57 collapse

Writing /s would have implied that my fellow lemurs don’t get jokes, and I give them more credit than that.

[deleted] on 29 Mar 2024 13:43 next collapse

.

sentient_loom@sh.itjust.works on 29 Mar 2024 15:14 collapse

Some people just don’t have a sense of humor.

And those people are YOU!!

Thanks for the finger-wagging, you moralistic rapist!

brbposting@sh.itjust.works on 29 Mar 2024 15:56 collapse

My sarcasm detector is between 8.5-9.5 outta ten.

Missed it this time, FWIW!

Dkarma@lemmy.world on 29 Mar 2024 13:30 collapse

No one’s a victim, no one’s being exploited. Same as taping a head onto a porno mag.

IsThisAnAI@lemmy.world on 29 Mar 2024 10:49 next collapse

Scamming is another thing. Fuck these people selling.

But fuck, dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that if you have money, you can pay people to do shit like clean your house or do an oil change.

NOBODY on that side of the equation is being exploited 🤣

istanbullu@lemmy.ml on 29 Mar 2024 11:40 next collapse

It’s an “I don’t know tech” tax.

sugar_in_your_tea@sh.itjust.works on 29 Mar 2024 12:38 next collapse

IDK, $10 seems pretty reasonable to run a script for someone who doesn’t want to. A lot of people have that type of arrangement for a job…

That said, I would absolutely never do this for someone, I’m not making nudes of a real person.

oce@jlai.lu on 29 Mar 2024 14:22 next collapse

That’s like 80% of the IT industry.

ColeSloth@discuss.tchncs.de on 29 Mar 2024 17:34 collapse

And mechanics exploit people needing brake jobs. What’s your point?

RobotToaster@mander.xyz on 29 Mar 2024 09:51 next collapse

This is only going to get easier. The djinn is out of the bottle.

goldteeth@lemmy.dbzer0.com on 29 Mar 2024 11:10 next collapse

“Djinn”, specifically, being the correct word choice. We’re way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We’re back into fuckin’… shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

BrokenGlepnir@lemmy.world on 29 Mar 2024 11:51 collapse

Have you ever seen the Wishmaster movies?

db2@lemmy.world on 29 Mar 2024 12:52 collapse

Make. Your. Wishes.

roscoe@lemmy.dbzer0.com on 29 Mar 2024 16:09 next collapse

As soon as anyone can do this on their own machine with no third parties involved, all laws and other measures being discussed will be moot.

We can punish nonconsensual sharing but that’s about it.

CeeBee@lemmy.world on 29 Mar 2024 16:17 next collapse

As soon as anyone can do this on their own machine with no third parties involved

We’ve been there for a while now

roscoe@lemmy.dbzer0.com on 29 Mar 2024 16:33 collapse

Some people can; I wouldn’t even know where to start. And is the photo/video generation already done completely on home machines, without any processing happening remotely?

I’m thinking about a future where simple tools are available where anyone could just drop in a photo or two and get anything up to a VR porn video.

CeeBee@lemmy.world on 29 Mar 2024 16:58 collapse

And is the photo/video generator completely on home machines without any processing being done remotely already?

Yes

roscoe@lemmy.dbzer0.com on 29 Mar 2024 17:06 collapse

Well…shit. It seems like any new laws are already too little too late then.

JDPoZ@lemmy.world on 29 Mar 2024 19:00 collapse

Stable Diffusion has been easy to install and run locally on any decent GPU for 2 years at this point.

Combine that with Civitai.com for easy-to-download-and-run models of almost anything you can imagine (IP, celebrities, concepts, etc.) and the possibilities have been endless.

In fact, completely free apps like Draw Things on iOS let you run it on YOUR PHONE locally, where you can download models, tweak, customize, and hand it images directly from your mobile device’s library. Making this stuff is now trivial on the go.

T156@lemmy.world on 30 Mar 2024 14:13 collapse

Tensor processors/AI accelerators have also been a thing on new hardware for a while. Mobile devices have them, Intel/Apple include them with their processors, and it’s not uncommon to find them on newer graphics cards.

That would just make it easier compared to needing quite a powerful computer for that kind of task.
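As a small illustration of the point, this is the kind of device check a PyTorch-based generator typically does to pick up whatever accelerator is present (a generic sketch, not any particular app’s code):

```python
import torch

# Pick the best available accelerator, falling back to CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")   # NVIDIA GPU, using tensor cores if present
elif torch.backends.mps.is_available():
    device = torch.device("mps")    # Apple silicon, via the Metal backend
else:
    device = torch.device("cpu")    # works everywhere, just much slower

print(f"Generation would run on: {device}")
```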

neptune@dmv.social on 29 Mar 2024 16:34 collapse

I can paint as many nude images of Rihanna as I want.

yildolw@lemmy.world on 29 Mar 2024 20:44 collapse

You may be sued for damages if you sell those nude paintings of Rihanna at a large enough scale that Rihanna notices

conciselyverbose@sh.itjust.works on 29 Mar 2024 17:45 collapse

Doesn’t mean distribution should be legal.

People are going to do what they’re going to do, and the existence of this isn’t an argument to put spyware on everyone’s computer to catch it or whatever crazy extreme you can take it to.

But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

ITGuyLevi@programming.dev on 29 Mar 2024 19:50 next collapse

While I agree in spirit, any law surrounding it would need to be very clearly worded, with certain exceptions carved out. Which I’m sure wouldn’t happen.

I could easily see people thinking something was of them, when in reality it was of someone else.

treadful@lemmy.zip on 29 Mar 2024 21:28 next collapse

Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

cley_faye@lemmy.world on 30 Mar 2024 04:23 collapse

I’m not familiar with the US laws, but… isn’t it already some form of crime to distribute a nude of someone without their consent? This should not change whether AI is involved or not.

T156@lemmy.world on 30 Mar 2024 14:11 collapse

It might depend on whether fabricating them wholesale would be considered a nude of the victim at all. Legally, it could be considered a different person if you’re making it, since the “nude” body is someone else’s and you’re putting the victim’s face on top, or it’s a complete fabrication made by a computer.

It’s unclear if it would still count if the body belonged to someone else who was lying about it being the victim; for example, pretending a headless mirror nude was sent by the victim when it was actually sent by someone else.

echo64@lemmy.world on 29 Mar 2024 10:35 next collapse

Every time this comes up, all the tech nerds here like to excuse it as fine and not a bad thing at all. I am hoping this won’t happen this time, but knowing Lemmy’s audience…

sbv@sh.itjust.works on 29 Mar 2024 11:47 next collapse

The Lemmy circlejerk is real, but excusing deep fake porn is pretty off brand for us. I’m glad the comments on this post are uniformly negative.

echo64@lemmy.world on 29 Mar 2024 13:17 collapse

sh.itjust.works/comment/10397565

kbin.social/m/technology@lemmy.world/t/…/5921190 just accept it as a new normal, it’s fine. Can’t possibly have any recourse, just accept it, women of the world, it’s the new normal!

sbv@sh.itjust.works on 29 Mar 2024 16:12 collapse

Okay, there are a couple of douche canoes, but generally speaking, I think we’re okay on this one.

echo64@lemmy.world on 29 Mar 2024 20:18 collapse

It is massively upvoted (for lemmy).

Thorny_Insight@lemm.ee on 29 Mar 2024 12:34 next collapse

I’m not saying it’s not a bad thing, but it’s inevitable. The problem will just get worse and there’s no stopping it. It’s something we’re just going to need to accept as a new normal. If we can deal with living under the constant threat of nuclear armageddon, then I think we can live with fake nudes as well.

echo64@lemmy.world on 29 Mar 2024 13:17 collapse

Yeah, it’s this shit I’m talking about. We have a whole legal and justice system to deal with this. No one needs to accept sexual abuse as a new normal. This shit is weird.

Thorny_Insight@lemm.ee on 29 Mar 2024 13:27 next collapse

I’m not saying there shouldn’t be consequences for someone who is spreading these pictures with the intention of harming someone’s reputation, but it’s incredibly naive to think that the justice system is going to stop deepfakes when it can’t even prevent bike theft. 12-year-olds are making these with their smartphones. The technology is extremely accessible and easy to use, and that is not going to change. I’m sorry, but you’re not putting the toothpaste back into the tube. Wait a few years and you can generate photorealistic porn videos of anyone you want.

echo64@lemmy.world on 29 Mar 2024 20:19 collapse

We can’t stop bike theft, so fuck off, women, you’re free game coz this guy said so.

Thorny_Insight@lemm.ee on 29 Mar 2024 21:48 collapse

When you start strawmanning you’ve already lost the argument.

echo64@lemmy.world on 29 Mar 2024 22:36 collapse

You might want to look up what strawmanning means. I’m just flat out mocking what you said.

Dkarma@lemmy.world on 29 Mar 2024 13:31 next collapse

No we don’t. What is happening here is not covered by current laws.

0x0@programming.dev on 29 Mar 2024 15:22 collapse

Sexual abuse?

Child pornography involves molesting a child and is a crime, as it should be.

Fake nudes have been a thing for ages and are only an issue if the targeted party takes offense. It may be slander but it’s certainly not sexual abuse.

No one is accepting sexual abuse so drop it down a notch, Karen.

sbv@sh.itjust.works on 29 Mar 2024 16:18 next collapse

only an issue if the targeted party takes offense.

Deep fakes can change how the victim is treated by other people. Especially other kids.

Upthread, someone states

we’re just going to need to accept as a new normal.

Which sounds a lot like accepting this kind of shit, regardless of what you call it.

eatthecake@lemmy.world on 30 Mar 2024 12:37 collapse

From another comment:

To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves. There’s also extreme risk of feeling depressed, angry, anxiety, etc. The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

Try to imagine watching a realistic video of yourself being abused, imagine your mother watching. That will absolutely fuck some people up, and a lot of those victims are going to be children. Shit is going to get bad.

0x0@programming.dev on 31 Mar 2024 00:13 collapse

I wouldn’t put actual non-consensual pornography and fake pornography of any kind in the same bag but, geez, I’m not a Dr.

Deep fakes do improve on the (technical) realism over 90s Photoshop for sure. Doesn’t that still qualify as slander? (Also not a lawyer.)

eatthecake@lemmy.world on 31 Mar 2024 01:15 collapse

The question is why, with an internet full of porn, do men want non consensual pornography that they know women are opposed to. It’s as if the hurtfulness, the lack of consent and the control over the woman in the video are actually the point.

0x0@programming.dev on 31 Mar 2024 23:51 collapse

Non-consensual pornography is called rape and it’s a crime in most of the world.

roscoe@lemmy.dbzer0.com on 29 Mar 2024 17:33 next collapse

I think part of the difficulty discussing this is the discussions usually combine two different things. The production and distribution.

I was informed elsewhere in this thread people can already produce these images/videos on their own machines with no third parties involved or remote processing. I can’t think of a single thing that can be done about that so acceptance is all we’ve got.

Nonconsensual sharing, on the other hand, we can and should do something about. The legal system won’t be able to stop it altogether but it can push it to the fringes and stop it from becoming mainstream so any victims wouldn’t see fake images/videos of themselves proliferating everywhere.

cley_faye@lemmy.world on 30 Mar 2024 04:27 collapse

It’s not a matter of excusing it. Distribution of someone’s picture without their explicit consent, and anything like that, is inexcusable. But we’re talking about the generation of said content, which technically can’t be stopped without seriously restraining everything.

General_Effort@lemmy.world on 29 Mar 2024 11:07 next collapse

Porn of Normal People

Why did they feel the need to add that “normal” to the headline?

sentient_loom@sh.itjust.works on 29 Mar 2024 11:24 next collapse

To differentiate from celebrities.

AdamEatsAss@lemmy.world on 29 Mar 2024 11:53 next collapse

This telegram user has a hard stance on “weirdos”.

TheGrandNagus@lemmy.world on 29 Mar 2024 12:27 next collapse

Because it’s different to somebody going online and finding a stock picture of Taylor Swift

yildolw@lemmy.world on 29 Mar 2024 20:45 collapse

People who have Wikipedia articles have less of an expectation of privacy than normal people

guyrocket@kbin.social on 29 Mar 2024 12:44 next collapse

This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why "AI" being involved matters is beyond me. The result is the same: fake porn/nudes.

And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

dysprosium@lemmy.dbzer0.com on 29 Mar 2024 13:01 next collapse

Exactly this. Better to believe cryptographically signed images, by comparing hashes with the one supplied by the owner. Then it’s rather a question of trusting a specific source for a specific kind of content. A news photo of the war in Ukraine by the BBC? Check the hash on their site. Their reputation is finished if a false image is found.
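As a rough sketch of the hash-checking idea (the URL and digest below are hypothetical placeholders; a real provenance scheme would verify a cryptographic signature, not just a bare hash):

```python
import hashlib
import urllib.request

# Hypothetical values for illustration only.
IMAGE_URL = "https://news.example.org/photos/frontline.jpg"
PUBLISHED_SHA256 = "..."  # the digest the publisher would list beside the photo

def sha256_hex(data: bytes) -> str:
    """SHA-256 digest of raw bytes, as lowercase hex."""
    return hashlib.sha256(data).hexdigest()

with urllib.request.urlopen(IMAGE_URL) as resp:
    image_bytes = resp.read()

if sha256_hex(image_bytes) == PUBLISHED_SHA256:
    print("This copy matches the publisher's listed digest.")
else:
    print("Mismatch: this copy differs from what the publisher vouches for.")
```

Note the trust still bottoms out at the publisher: the hash only proves a copy matches whatever the BBC (or whoever) chose to vouch for.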

T156@lemmy.world on 30 Mar 2024 13:59 collapse

At the same time, that does introduce an additional layer of work. Most people aren’t going to bother with the extra work involved, in much the same way that people today won’t track an image back to its original source, but usually just go by the copy they saw.

Especially for people who aren’t so cryptographically or technologically inclined that they know what a hash is, where to find one, and how to compare it (without just opening them both and checking personally).

dysprosium@lemmy.dbzer0.com on 31 Mar 2024 11:56 collapse

Sure, but that’s no problem if software did that automatically for users of big (news) sites. Browsers on desktop and apps on phones.

kent_eh@lemmy.ca on 29 Mar 2024 13:05 next collapse

People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

Dkarma@lemmy.world on 29 Mar 2024 13:29 next collapse

Not relevant. Using someone’s picture never ever required consent.

Bob_Robertson_IX@discuss.tchncs.de on 29 Mar 2024 13:40 next collapse

A kid at my high school in the early 90s would use a photocopier and would literally cut and paste yearbook headshots onto porn photos. This could also be done in bulk and doesn’t require any skills that a 1st grader doesn’t have.

driving_crooner@lemmy.eco.br on 29 Mar 2024 14:47 next collapse

And did it look as realistic as the AI jobs do?

ChexMax@lemmy.world on 29 Mar 2024 16:05 collapse

Those are easily disproven. There’s no way you think that’s the same thing. If you can pull up the source photo and it’s a clear match/copy for the fake, it’s easy to disprove. AI can alter the angle, position, and expression on your face in a believable manner, making it a lot harder to link the photo to source material.

Bob_Robertson_IX@discuss.tchncs.de on 29 Mar 2024 18:21 collapse

This was before Google was a thing, much less reverse lookup with Google Images. The point I was making is that this kind of thing happened even before Photoshop. Photoshop made it look even more realistic. AI is the next step. And even the current AI abilities are nothing compared to what they will be even 6 months from now. Yes, this is a problem, but it has been a problem for a long time, and anyone who has wanted to create fake nudes of someone has had the ability to easily do so for at least a generation now. We might be at the point now where if you want to make sure you don’t have fake nudes created of you, then you don’t have images of yourself published. However, now that everyone has high-quality cameras in their pockets, even this won’t 100% protect you.

ArmokGoB@lemmy.dbzer0.com on 29 Mar 2024 15:10 next collapse

I blame electricity. Before computers, people had to learn to paint to do this. We should go back to living like medieval peasants.

0x0@programming.dev on 29 Mar 2024 15:14 next collapse

Those were the days…

KISSmyOS@feddit.de on 30 Mar 2024 06:07 next collapse

This, but unironically.

Lucidlethargy@sh.itjust.works on 30 Mar 2024 17:11 collapse

[Image: relevant reference from Wedding Crashers]

nednobbins@lemm.ee on 29 Mar 2024 22:53 next collapse

As much skill as a 9 year old and a 16 year old can muster?

en.wikipedia.org/wiki/Cottingley_Fairies

Vespair@lemm.ee on 31 Mar 2024 02:41 collapse

no skill from the person doing it.

This feels entirely non-sequitur, to the point of damaging any point you’re trying to make. Whether I paint a nude or the modern Leonardo da Vinci paints a nude, our rights (and/or the rights of the model, depending on your perspective on this issue) should be no different, despite the enormous chasm that exists between our artistic skill.

daddy32@lemmy.world on 29 Mar 2024 13:15 next collapse

Scale.

echo64@lemmy.world on 29 Mar 2024 13:24 next collapse

I hate this: “Just accept it women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.

We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

AquaTofana@lemmy.world on 29 Mar 2024 13:50 next collapse

I don’t know why you’re being down voted. Sure, it’s unfortunately been happening for a while, but we’re just supposed to keep quiet about it and let it go?

I’m sorry, putting my face on a naked body that’s not mine is one thing, but I really do fear for the people whose likeness gets used in some degrading/depraved porn and it’s actually believable because it’s AI generated. That is SO much worse/psychologically damaging if they find out about it.

HubertManne@kbin.social on 29 Mar 2024 13:55 next collapse

typical morning for me.

0x0@programming.dev on 29 Mar 2024 15:15 next collapse

Because gay porn is a myth I guess…

Jrockwar@feddit.uk on 29 Mar 2024 16:18 next collapse

And so is straight male-focused porn. We men are seemingly not attractive, other than for perfume ads. It’s unbelievable gender roles are still so strongly coded in 2024. Women must be pretty, men must buy products where women look pretty in ads. Men don’t look pretty and women don’t buy products; they clean the house and care for the kids.

I’m aware of how much I’m extrapolating, but a lot of this is the subtext under “they’ll make porn of your sisters and daughters”, while leaving your good-looking brother/son out of the thought train, when that’d be just as hurtful for them and yourself.

lud@lemm.ee on 30 Mar 2024 08:41 collapse

Or your bad-looking brother, or bad-looking me.

Imo, people making AI fakes for themselves isn’t the end of the world; the real problem is distribution and blackmail.

You can get blackmailed no matter your gender, and it will happen to both genders.

echo64@lemmy.world on 29 Mar 2024 20:21 collapse

Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

Thorny_Insight@lemm.ee on 29 Mar 2024 21:56 collapse

Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

echo64@lemmy.world on 29 Mar 2024 22:35 collapse

Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

brbposting@sh.itjust.works on 29 Mar 2024 16:25 next collapse

It’s unacceptable.

We have legal and justice systems to deal with this.

For reference, here’s how we’re doing with child porn. Platforms with problems include (copying from my comment two months ago):

[screenshot]

Ill adults and poor kids generate and sell CSAM. Common to advertise on IG, sell on TG. Huge problem as that Stanford report shows.

[screenshot]

Telegram got right on it (not). Fuckers.

SharkAttak@kbin.social on 29 Mar 2024 21:46 next collapse

It's not normal, but neither is it new: you could already cut and glue your cousin's photo onto a Playboy girl, or Photoshop the hot neighbour onto Stallone's muscle body. Today it's just easier.

echo64@lemmy.world on 29 Mar 2024 22:38 collapse

I don’t care if it’s not new, no one cares about how new it is.

cley_faye@lemmy.world on 30 Mar 2024 04:21 collapse

How do you propose to deal with someone doing this on their computer, not posting them online, for their “enjoyment”? Mass global surveillance of all existing devices?

It’s not a matter of willingly accepting it; it’s a matter of looking at what can be done and what can not. Publishing fake porn, defaming people, and other similar actions are already (I hope… I am not a lawyer) illegal. Asking for the technology that exists, is available, will continue to grow, and can be used in a private setting with no witness to somehow “stop” because of a law is at best wishful thinking.

Ookami38@sh.itjust.works on 30 Mar 2024 19:32 collapse

There’s nothing to be done, nor should be done, about anything someone individually creates, for their own individual use, never to see the light of day. Anything else is about one step removed from thought policing; after all, what’s the difference between a personally created, private image and the thoughts in your brain?

The other side of that is, we have to have protection for the people this has been or will be used against. Strict laws regarding posting or sharing material. Easy and fast removal of abusive material. Actual enforcement. I know we have these things in place already, but they need to be stronger and more robust. The one absolute truth with generative AI, versus Photoshop etc., is that it’s significantly faster and easier, so there will likely be an uptick in this kind of material, and thus the need to re-examine current laws.

Assman@sh.itjust.works on 29 Mar 2024 13:31 next collapse

The same reason AR15 rifles are different than muskets

HubertManne@kbin.social on 29 Mar 2024 13:52 next collapse

This is something I can't quite get through to my wife. She does not like that I dismiss things to some degree when they don't make sense. We get into these convos where I'm like, "I have serious doubts about this," and she's like, "Are you saying it did not happen?" And I'm like, "No, it may have happened, but not quite in the way they say, or it's being portrayed in a certain manner." I'm still going to take video and photos for now as being likely true, but I generally want to see them from independent sources, like different folks with their phones along with CCTV of some kind and such.

Pretzilla@lemmy.world on 29 Mar 2024 13:59 collapse

Ok so pay the dude $10 to put your wife’s head on someone agreeing with you. Problem solved.

HubertManne@kbin.social on 29 Mar 2024 15:59 next collapse

lol, there you go. “Hey, you cheated on me. It’s in this news article right here.”

roscoe@lemmy.dbzer0.com on 29 Mar 2024 16:02 collapse

I didn’t expect to get a laugh out of reading this discussion, thanks.

AstralPath@lemmy.ca on 29 Mar 2024 20:10 next collapse

This kind of attitude toward non-consensual actions is what perpetuates them. Fuck that shit.

SharkAttak@kbin.social on 29 Mar 2024 21:40 next collapse

But I saw it on tee-vee!

EatATaco@lemm.ee on 30 Mar 2024 01:35 collapse

The irony of parroting this mindless and empty talking point is probably lost on you.

SharkAttak@kbin.social on 02 Apr 2024 22:17 collapse

God, do I really have to start putting the /jk or /s back, for those who don't get it like you??

EatATaco@lemm.ee on 02 Apr 2024 22:22 collapse

Upgraded to “definitely.”

SharkAttak@kbin.social on 02 Apr 2024 23:15 collapse

Okay, okay, you won. Happy now? Now go.

EatATaco@lemm.ee on 02 Apr 2024 23:19 collapse

Ok thanks

EatATaco@lemm.ee on 30 Mar 2024 01:34 next collapse

I suck at Photoshop, and I’ve tried many times to get good at it over the years. Yet I was able to train a local Stable Diffusion model on my and my family’s faces and create numerous images of us in all kinds of situations in 2 nights of work. You can get a snap of someone and have nudes of them tomorrow for super cheap.

I agree there is nothing to be done, but it’s painfully obvious to me that it’s the scale and ease of it that makes it much more concerning.

T156@lemmy.world on 30 Mar 2024 13:56 collapse

Also the potential for automation/mass-production. Photoshop work still requires a person to sit down to do the actual photoshop. You can try to script things out, but it’s hardly an easy affair.

By comparison, generative models are much more hands-free. Once you get the basics set up, you can just have it go, and churn things at rates well surpassing what a single human could reasonably do (if you have the computing power for it).
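To make the scale point concrete: once a pipeline is configured, a plain loop like this (a generic sketch; model ID, prompt, and batch size are arbitrary assumptions) churns out images unattended for as long as you let it run:

```python
import torch
from diffusers import StableDiffusionPipeline

# One-time setup; after this, generation needs no human in the loop.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

for i in range(1000):  # arbitrary batch size
    image = pipe("a watercolor landscape at dusk").images[0]
    image.save(f"batch_{i:04d}.png")
```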

A_Very_Big_Fan@lemmy.world on 31 Mar 2024 20:23 collapse

Why “AI” being involved matters is beyond me.

The AI hysteria is real, and clickbait is money.

OozingPositron@feddit.cl on 29 Mar 2024 13:07 next collapse

You don’t even need to pay, some people do it for free on /b/

bigkahuna1986@lemmy.ml on 29 Mar 2024 14:19 next collapse

This business is going to get out of control. It’s going to get out of control and we’ll be lucky to live through it.

flower3@feddit.de on 29 Mar 2024 15:40 next collapse

I doubt tbh that this is the most severe harm of generative AI tools lol

Luisp@lemmy.dbzer0.com on 29 Mar 2024 15:46 next collapse

The Israeli facial recognition program, for example.

CeeBee@lemmy.world on 29 Mar 2024 16:15 collapse

FR is not generative AI, and people need to stop crying about FR being the boogeyman. The harm that FR can potentially cause has been covered and surpassed by other forms of monitoring, primarily smartphone and online tracking.

BleatingZombie@lemmy.world on 29 Mar 2024 16:20 collapse

I wholeheartedly disagree on it being surpassed

If someone doesn’t have a phone and doesn’t go online then they can still be tracked by facial recognition. Someone who has never agreed to any Terms and Conditions can still be tracked by facial recognition

I don’t think there’s anything as dubious as facial recognition due to its ability to track almost anyone regardless of involvement with technology

neatchee@lemmy.world on 29 Mar 2024 16:39 collapse

You don’t need to be online or use a digital device to be tracked by your metadata. Your credit card purchases, phone calls, vehicle license plate, and more can all be correlated.

Additionally, saying “just don’t use a phone” is no different than saying “just wear a mask outside your house”. Both are impractical, if not functionally impossible, in modern society

I’m not arguing which is “worse”, only speaking to the reality we live in

BleatingZombie@lemmy.world on 29 Mar 2024 16:41 collapse

I am arguing which is worse. There are people in Palestine who don’t have the internet, don’t have a phone, and don’t have a credit card. How are they being tracked without facial recognition?

I also didn’t say don’t use a phone. I don’t know where you got that

neatchee@lemmy.world on 29 Mar 2024 17:07 collapse

I know what you’re arguing and why you’re arguing it and I’m not arguing against you.

I’m simply adding what I consider to be important context

And again, the things I listed specifically are far from the only ways to track people. Shit, we can identify people using only the interference their bodies create in a wifi signal, or their gait. There are a million ways to piece together enough details to fingerprint someone. Facial recognition doesn’t have a monopoly on that bit of horror

FR is the buzzword boogieman of choice, and the one you are most aware of because people who make money from your clicks and views have shoved it in front of your face. But go ahead and tell me about what the “real threat” is 👍👍👍

BleatingZombie@lemmy.world on 29 Mar 2024 17:51 collapse

I didn’t say “real threat” either. I’m not sure where you’re getting these things I’m not saying

I think facial recognition isn’t as much of a “buzzword” as much as it is just the most prevalent issue that affects the most people. Yes there are other ways to track people, but none that allow you to easily track everybody regardless of their involvement with modern technology other than facial recognition

(Just to be clear I’m not downvoting you)

neatchee@lemmy.world on 29 Mar 2024 18:08 collapse

That’s why I put “real threat” in quotes ; I was paraphrasing what I consider to be the excessive focus on FR

I’m a security professional. FR is not the easiest way to track everybody/anybody. It’s just the most visible and easily grok’d by the general public because it’s been in movies and TV forever

To wit, FR itself isn’t what makes it “easy”, but rather the massive corpus of freely available data for training, combined with the willingness of various entities to share resources (e.g. sharing surveillance video with law enforcement).

What’s “easiest” entirely depends on the context, and usually it’s not FR. If I’m trying to identify the source of a particular set of communications, FR is mostly useless (unless I get lucky and identify, like, the mailbox they’re using or something silly like that). I’m much more interested in voice identification, fingerprinting, geolocation, etc in that scenario

Again, FR is just…known. And visible. And observable in its use for nefarious purposes by shitty governments and such.

It’s the stuff you don’t see on the news or in the movies that you should really be worried about

(and I’m not downvoting you either; that’s for when things don’t contribute, or deserve to be less visible because of disinformation; not for when you disagree with someone)

sbv@sh.itjust.works on 29 Mar 2024 17:23 next collapse

I’m pretty sure the AI enabled torture nexus is right around the corner.

Sanctus@lemmy.world on 29 Mar 2024 19:00 collapse

Pretty sure we will see fake political candidates that actually garner votes soon here.

SnotFlickerman@lemmy.blahaj.zone on 29 Mar 2024 19:31 next collapse

The Waldo Moment made manifest.

captain_aggravated@sh.itjust.works on 30 Mar 2024 04:06 collapse

Re-elect Deez Nuts.

curiousaur@reddthat.com on 29 Mar 2024 19:27 next collapse

You can get 300 tokens in pornx dot ai for $9.99. This guy is ripping people off.

noxy@yiffit.net on 29 Mar 2024 23:39 next collapse

I wonder how holodecks handle this…

shasta@lemm.ee on 30 Mar 2024 01:27 next collapse

They send you to therapy because “it’s not healthy to live in a fantasy.”

antlion@lemmy.dbzer0.com on 30 Mar 2024 05:37 collapse

Don’t be like Lt Reg Barclay

9488fcea02a9@sh.itjust.works on 30 Mar 2024 17:59 collapse

Probably the same types of guardrails chatGPT has when you ask it to tell you how to cook meth or build a dirty bomb

And maybe Data was distributing jailbroken holodeck programs for pervs on the ship

SendMePhotos@lemmy.world on 30 Mar 2024 00:28 next collapse

I’d like to share my initial opinion here. “Non-consensual AI-generated nudes” are technically a freedom, no? Like, we can bastardize our presidents, paste people’s photos on devils or other characters; why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

LadyAutumn@lemmy.blahaj.zone on 30 Mar 2024 02:31 next collapse

They’re making pornography of women who are not consenting to it, when that is an extremely invasive thing to do that has massive social consequences for women and girls. This could (and almost certainly will) be used on kids too, right? This can literally be a tool for the production of child pornography.

Even with regards to adults, do you think this will be used exclusively on public figures? Do you think people aren’t taking pictures of their classmates, of their co-workers, of women and girls they personally know and having this done to pictures of them? It’s fucking disgusting, and horrifying. Have you ever heard of the correlation between revenge porn and suicide? People literally end their lives when pornographic material of them is made and spread without their knowledge and consent. It’s terrifyingly invasive and exploitative. It absolutely can and must be illegal to do this.

cley_faye@lemmy.world on 30 Mar 2024 04:16 collapse

It absolutely can and must be illegal to do this.

Given that it can be done in a private context and there is absolutely no way to enforce it without looking into random people’s computer unless they post it online publicly, you’re just asking for a new law to reassure people with no effect. That’s useless.

WaxedWookie@lemmy.world on 30 Mar 2024 08:50 next collapse

Strange of you to respond to a comment about the fakes being shared in this way…

Do you have the same prescriptions in relation to someone with a stash of CSAM, and if not, why not?

cley_faye@lemmy.world on 30 Mar 2024 12:05 collapse

No. Because in one case, someone ran a program on his computer and the output might hurt someone else’s feelings if they ever find out, and in the other case people/kids were exploited for sexual purposes to begin with and their lives torn apart, regardless of the distribution of the stuff.

How is that a hard concept to understand?

LadyAutumn@lemmy.blahaj.zone on 30 Mar 2024 20:48 next collapse

How can you describe your friends, family, co-workers, peers, making and sharing pornography of you, and say that it comes down to hurt feelings??? It’s taking someone’s personhood, their likeness, their autonomy, their privacy, and reducing them down to a sexual act for which they provide no knowledge or consent. And you think this stays private?? Are you kidding me?? Men have literally been caught making snapchat groups dedicated to sharing their partner’s nudes without their consent. You either have no idea what you’re talking about or you are intentionally downplaying the seriousness of what this is. Like I said in my original comment, people contemplate and attempt suicide when pornographic content is made and shared of them without their knowledge and consent. This is an incredibly serious discussion.

It is people like you, yes you specifically, that provide the framework by which the sexual abuse of women is justified.

afraid_of_zombies@lemmy.world on 31 Mar 2024 01:57 collapse

I agree. For any guy here who doesn’t care imagine one of your “friends” made an AI porn of you where you have a micropenis and erection problems. I doubt you would be over the moon about it. Or if that doesn’t work imagine it was someone you love. Maybe you don’t want your grandma’s face in a porn.

WaxedWookie@lemmy.world on 31 Mar 2024 02:10 collapse

What’s hard to understand is why you skipped the question I asked, and answered a different one instead.

The creation of the CSAM is unquestionably far more harmful, but I wasn’t talking about the *creation*; I was talking about the possession. The harm of the creation is already done, and whether or not the material exists after that does nothing to undo that harm.

Again, is your prescription the same as it relates to the possession, not generation of CSAM?

LadyAutumn@lemmy.blahaj.zone on 30 Mar 2024 20:49 collapse

No, I’m saying make it so that you go to prison for taking pictures of someone and making pornography of them without their consent. Pretty straightforward. If you’re found doing it, off to rot in prison with you.

GhostTheToast@lemmy.world on 30 Mar 2024 02:36 next collapse

Don’t get me wrong it’s unsettling, but I agree, I don’t see the initial harm. I see it as creating a physical manifestation of someone’s inner thoughts. I can definitely see how it could become or encourage dangerous situations, but that’s like banning alcohol because it could lead to drunk driving or sexual assault.

Mastengwe@lemm.ee on 30 Mar 2024 05:27 collapse

Innocently drinking alcohol is in NO WAY comparable to creating deepfakes of people without consent.

One is an innocent act that has potentially harsh consequences; the other is a disgusting and invasively violating act that has the potential to ruin an innocent person’s life.

abhibeckert@lemmy.world on 30 Mar 2024 04:27 next collapse

The internet made photos of trump and putin kissing shirtless.

And is that OK? I mean I get it, free speech, but just because congress can’t stop you from expressing something doesn’t mean you actually should do it. It’s basically bullying.

Imagine you meet someone you really like at a party, they like you too and look you up on a social network… and find galleries of hardcore porn with you as the star. Only you’re not a porn star, those galleries were created by someone who specifically wanted to hurt you.

AI porn without consent is clearly illegal in almost every country in the world, and in the ones where it’s not illegal yet, it will be soon. The 1st Amendment will be a stumbling block, but it’s not an impenetrable wall; Congress can pass laws that limit speech in certain edge cases, and this will be one of them.

WaxedWookie@lemmy.world on 30 Mar 2024 06:04 collapse

The internet made photos of trump and putin kissing shirtless.

And is that OK?

I’m going to jump in on this one and say yes - it’s mostly fine.

I look at these things through the lens of the harm they do and the benefits they deliver - consequentialism and act utilitarianism.

The benefits are artistic, comedic and political.

The “harm” is that Putin and or Trump might feel bad, maaaaaaybe enough that they’d kill themselves. All that gets put back up under benefits as far as I’m concerned - they’re both extremely powerful monsters that have done and will continue to do incredible harm.

The real harm is that such works risk normalising this treatment of regular folk, which is genuinely harmful. I think that’s unlikely, but it’s impossible to rule out.

Similarly, the dissemination of the kinds of AI fakes under discussion is a negative, because they do serious, measurable harm.

Mananasi@feddit.nl on 30 Mar 2024 07:55 collapse

I think that is okay because there was no intent to create pornography there. It is a political statement. As far as I am concerned that falls under free speech. It is completely different from creating nudes of random people/celebrities with the sole purpose of wanking off to it.

SendMePhotos@lemmy.world on 30 Mar 2024 14:18 collapse

Is that different than wanking to clothed photos of the same people?

RageAgainstTheRich@lemmy.world on 30 Mar 2024 17:46 collapse

The difference is that the image is fake but you can’t really see that it’s fake. It’s so easily created using these tools, and it can be used to harm people.

The issue isn’t that you’re jerking off to it. The issue is that it can create fake photos of people in situations that can be incredibly difficult to deny ever happened.

UsernameIsTooLon@lemmy.world on 30 Mar 2024 04:41 next collapse

Lemme put it this way: freedom of speech isn’t freedom from consequences. You talk shit, you’re gonna get hit. Is it truly freedom if you’re infringing on someone else’s rights?

John_McMurray@lemmy.world on 30 Mar 2024 04:54 collapse

Yeah you don’t have the right to prevent people from drawing pictures of you, but you do have the right not to get hit by some guy you’re drawing.

antlion@lemmy.dbzer0.com on 30 Mar 2024 05:45 next collapse

Seems to fall under any other form of legal public humiliation to me, UNLESS it is purported to be true or genuine. I think if there’s a clear AI watermark or artists signature that’s free speech. If not, it falls under Libel - false and defamatory statements or facts, published as truth. Any harmful deep fake released as truth should be prosecuted as Libel or Slander, whether it’s sexual or not.

Maggoty@lemmy.world on 30 Mar 2024 05:58 next collapse

It’s a far cry from making weird memes to making actual porn. Especially when it’s not easily seen as fake.

douglasg14b@lemmy.world on 30 Mar 2024 15:31 collapse

I think their point is: where is the line, and why is it where it is?

Maggoty@lemmy.world on 30 Mar 2024 20:14 collapse

Psychological trauma. Normal people aren’t used to dealing with that and even celebrities seek help for it. Throw in the transition period where this technology is not widely known and you have careers on the line too.

Ookami38@sh.itjust.works on 30 Mar 2024 19:21 next collapse

I think the biggest thing with that is Trump and Putin live public lives. They live lives scrutinized by media and the public. They bought into those lives; they chose them. Because of that, there are certain things we let slide when they’re done to them that wouldn’t necessarily be acceptable if done to a normal, private citizen; since their lives are already public, we turn a bit of a blind eye. And yes, this applies to celebrities too.

I don’t necessarily think the above is a good thing, I think everyone should be entitled to some privacy, having the same thing done to a normal person living a private life is a MUCH more clear violation of privacy.

afraid_of_zombies@lemmy.world on 31 Mar 2024 01:54 collapse

Public figures vs private figures. Fair or not a public figure is usually open season. Go ahead and make a comic where Ben Stein rides a horse home to his love nest with Ben Stiller.

GrymEdm@lemmy.world on 30 Mar 2024 04:54 next collapse

To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves. There’s also extreme risk of feeling depressed, angry, anxiety, etc. The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

I’ll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately and avoid it as much as I possibly can. I believe porn can be done correctly with participant protection and respect. Regarding deepfakes/revenge porn though that statistic about suicidal ideation puts it outside of healthy or ethical. Obviously I can’t make that decision for others or purge the internet, but the fact that there’s such regular and extreme harm for the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society but because I want my entertainment to be at minimum consensual and hopefully fun and exciting, not killing people or ruining their happiness.

I get that people say this is the new normal, but it’s already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.


Regrettable_incident@lemmy.world on 30 Mar 2024 05:52 next collapse

I’m wondering if this may already be illegal in some countries. Revenge porn laws now exist in some countries, and I’m not sure if the legislation specifies how the material should be produced to qualify. And if the image is based on a minor, that’s often going to be illegal too - some places I hear even pornographic cartoons are illegal if they feature minors. In my mind people who do this shit are doing something pretty similar to putting hidden cameras in bathrooms.

lud@lemm.ee on 30 Mar 2024 08:34 next collapse

once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves.

Not saying that they are justified or anything but wouldn’t people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.

Drewelite@lemmynsfw.com on 30 Mar 2024 12:04 next collapse

I think this is realistically the only way forward: to delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it’ll still have an effect. Like social media: though it’s now normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

afraid_of_zombies@lemmy.world on 31 Mar 2024 01:52 collapse

Well, if you are sending nudes to someone in high school, you are sending porn to a minor, which I am pretty confident is already illegal. I just would rather not search for that law.

Drewelite@lemmynsfw.com on 31 Mar 2024 02:35 collapse

[image]

eatthecake@lemmy.world on 30 Mar 2024 12:23 collapse

The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you’ll get a whole lot of complex PTSD instead.

stephen01king@lemmy.zip on 30 Mar 2024 19:00 collapse

People used to think their lives are over if they were caught alone with someone of the opposite sex they’re not married to. That is no longer the case in western countries due to normalisation.

The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

eatthecake@lemmy.world on 30 Mar 2024 23:49 next collapse

The thing that makes them want to die is societal pressure, not the act itself.

That’s an assumption that you have no evidence for. You are deciding what feelings people should have by your own personal rules and completely ignoring the people who are saying this is a violation. What gives you the right to tell people how they are allowed to feel?

too_much_too_soon@lemmy.world on 31 Mar 2024 05:48 collapse

Agreed.

I’ve been in HR since '95, so yeah, I’m old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don’t remember but got posted? If they’re at least a decade old, they’re not as big a deal now. But if it was super illegal, immoral, or harmful, you’re still in trouble.

As for nudes, they can be both the problem and the solution.

To sum it up, like in the animated movie ‘The Incredibles’: ‘If everyone’s special, then no one is.’ If no image can be trusted, no excuse can be doubted. ‘It wasn’t me’ becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many.

Of course, this is oversimplifying things in the real world, but society will adjust. People won’t kill themselves over this. It might even be a good thing for those on the cusp of AI and improper real-world behaviours: ‘It’s not me. It’s clearly AI; I would never behave so outrageously.’

spez_@lemmy.world on 31 Mar 2024 06:06 collapse

The technology will become available everywhere and run on every device over time. Nothing will stop this

Mastengwe@lemm.ee on 30 Mar 2024 05:19 next collapse

As long as there are simps, there will always be this bullshit. And there will always be simps, because it isn’t illegal to be pathetic.

ItsAFake@lemmus.org on 30 Mar 2024 06:33 collapse

Come on, prisons are over populated as it is, if that happens then you me and everyone here are fucked.

Mastengwe@lemm.ee on 30 Mar 2024 15:34 collapse

I’m not a simp so it’s not a problem for me.

RoseTintedGlasses@lemmy.blahaj.zone on 30 Mar 2024 08:57 next collapse

We need to shut this whole coomer thing down until we work out wtf is going on in their brains.

RoseTintedGlasses@lemmy.blahaj.zone on 30 Mar 2024 08:58 collapse

Unironically though, anyone who does this should just be locked up

boatsnhos931@lemmy.world on 30 Mar 2024 14:36 next collapse

Oooo puter’ bobs and vagenes. Scissor me timbers that gets me hot n bothered

anticurrent@sh.itjust.works on 30 Mar 2024 19:39 next collapse

We are acting as if throughout history we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. And it is not gonna be different with AI.

ulterno@lemmy.kde.social on 30 Mar 2024 19:51 next collapse

I say stop antagonizing the AI.
The only difference between a skilled artist making it with Photoshop and someone using a neural net is the amount of time and effort put into creating the instance.

If there are to be any laws against these, they need to apply to any and every fake that’s convincing enough, no matter the technology used.


Don’t blame the gunpowder for the dead person.
People were being killed by thrown stones before that.

Ultragigagigantic@lemmy.world on 31 Mar 2024 02:03 collapse

The laws that oppress us on a daily basis suck ass, I’ll give y’all that for fucking sure… but downvoting someone wishing for the law to be applied equally to all?

Maybe I should go back to 4chan.

ulterno@lemmy.kde.social on 31 Mar 2024 05:42 collapse

OooOo!
That’s some high number of dwnv0t3s!
I wouldn’t have realised if you hadn’t replied here.

Nice, but it’s also good that everyone is at least free to downvote and see the number of downvotes, unlike YouTube.


All through history, there has been this trend of people misusing technology and then blaming the technology instead of those who misuse it. This trend is detrimental to the technological progress of a civilisation and is one of the driving forces behind the cycle our civilisation is stuck in (losing all tech and history every once in a while and then having to start over from the dark ages).
Technology gives someone the ability to do something, but it is their will that makes them want to do so. If the “something” is considered “bad” for society, then instead of taking away the ability, we need to insist on getting the person to understand why and how said “something” is a problem for society.

Until this problem is fixed, we are going to be stuck at the barrier and the next levels of civilisation shall stay a part of Fiction.

afraid_of_zombies@lemmy.world on 31 Mar 2024 01:47 next collapse

It’s stuff like this that makes me against copyright laws. To me it is clear and obvious that you own your own image, and it is far less obvious that a company can own an image its creator drew multiple decades ago that everyone can identify. And yet one is protected and the other isn’t.

What the hell do you own if not yourself? How come a corporation has more rights than we do?

LodeMike@lemmy.today on 31 Mar 2024 04:30 collapse

This stuff should be defamation, full stop. There would need to be a law specifically saying you can’t sign it away, though.

spez_@lemmy.world on 31 Mar 2024 06:05 collapse

Get out of the way of progress

Ultragigagigantic@lemmy.world on 31 Mar 2024 01:58 next collapse

It’s gonna suck no matter what once the technology became available. Perhaps in a bunch of generations there will be a massive cultural shift to something less toxic.

May as well drink the poison if I’m gonna be immersed in it. Cheers.

VinnyDaCat@lemmy.world on 31 Mar 2024 06:10 collapse

I was really hoping that with the onset of AI people would be more skeptical of content they see online.

This was one of the reasons. I don’t think there’s anything we can do to prevent people from acting like this, but what we can do as a society is adjust to it so that it’s not as harmful. I’m still hoping that the technology eventually becoming easily accessible and usable will help people look at all content much more closely.

Kuinox@lemmy.world on 31 Mar 2024 06:21 next collapse

The root problem is governments not enforcing the law on the internet. Deepfakes have existed for years.
Law enforcement should be more proactive on the internet.

Cris_Color@lemmy.world on 31 Mar 2024 06:43 collapse

God, generative AI is such a fucking caustic technology. I honestly don’t see anything positive and not disgusting enabled by this tech.

Edit: I see people don’t agree, but why can’t AI stick to translating stuff and being useful, rather than making horrifically unethical porn, taking the humanity out of art, and replacing people’s jobs with statistical content generation? I hate it here.

reddithalation@sopuli.xyz on 31 Mar 2024 07:06 next collapse

I liked AI when it was a bunch of researchers messing around, but commercialized AI is horrifying.

Mubelotix@jlai.lu on 31 Mar 2024 08:06 collapse

You can call people disgusting over what they do with a tool, but the tool itself is just a tool; it can’t be disgusting.

Cris_Color@lemmy.world on 31 Mar 2024 08:20 collapse

The distinction is that I can see worthwhile use cases for non-generative AI and not for generative AI, and generative AI is built on theft of creative labor.

I’m not angry at people who use generative AI; I’m angry at the people who built it by stealing from creatives to make a commercial tool that can seemingly only be used in awful ways.

Mubelotix@jlai.lu on 31 Mar 2024 10:29 collapse

That I can understand