Dsklnsadog@lemmy.dbzer0.com
on 27 Feb 2025 19:29
Oh, I’m sorry, I didn’t know. That perfectly proves my point that some people need to touch some grass, but anyway, I’m not here to judge anyone’s life, it was just a little piece of advice. That’s all for me. Have a good one.
Let me guess, your parents don’t let you watch movies or play video games so you have anger issues towards adults that have done so their entire lives. That’s too bad.
Dsklnsadog@lemmy.dbzer0.com
on 27 Feb 2025 20:53
Oh, you wanna talk about me? My parents bought a chipped PS1 when I was a kid and I loved it. Now I have a PS5 (I used to play AC, GTA, and FIFA, but I got bored, so it became a "youtube frontend"). I had cable TV and a PC with Windows 95/98/XP/Vista (according to the era) and later Ubuntu. Great childhood, but I also had friends; I used to play football every single weekday (the one with the foot on the ball). Now I only play once a week on my local team (but I do it really badly, so I play defense).
Anything more you wanna know? I’m all for sharing! =)
I know, I know, too much about me. Have a great day. And next time just ignore it or take the advice. If you feel I'm trolling, just don't feed me. Bye!!
L0rdMathias@sh.itjust.works
on 27 Feb 2025 19:35
Based on their response habits, it's likely a poorly made AI or a 13-year-old kid. Not worth interacting with, cuz either it is incapable of caring, or they really aren't supposed to be here and we really shouldn't welcome children into adult spaces by allowing them into the conversation. If they don't wanna have a discussion, then why would they contribute to the conversation when spoken to?
Dsklnsadog@lemmy.dbzer0.com
on 27 Feb 2025 20:00
The advice was “touch” not “smoke” but at least you were out for some time.
L0rdMathias@sh.itjust.works
on 27 Feb 2025 19:37
Left of Rei. Likely her twin sister Ram is to the left of her, but iirc they have different color hairbands, so it's probably just two copies of Rem, so like [Rem, Rem, Rei]
davidgro@lemmy.world
on 27 Feb 2025 19:54
Character from an anime called Re:Zero
Edit: I guess I got whooshed? I haven’t seen the show yet, just know what some characters look like.
Spawn7586@lemmy.world
on 27 Feb 2025 21:10
That was so high level it went past most of them. Take my upvote
thisistricky@lemm.ee
on 27 Feb 2025 22:16
Dolphinfreetuna@lemmy.world
on 27 Feb 2025 22:15
#1 is markiplier
TheFogan@programming.dev
on 27 Feb 2025 23:40
To which I have to say… good on them for using AI porn in the least bad way? (i.e. realistic fictional characters instead of real people who did not consent to the depictions being made of them).
brucethemoose@lemmy.world
on 28 Feb 2025 23:03
That’s mostly what AI Porn land is now, as animation is easier and hence fictional characters are easier. Trainers also tend to exclude real people from base models.
lefixxx@lemmy.world
on 28 Feb 2025 06:43
JoMiran@lemmy.ml
on 27 Feb 2025 18:48
I am going to make this statement openly on the Internet. Feel free to make AI generated porn of me as long as it involves adults. Nobody is going to believe that a video of me getting railed by a pink wolf furry is real. Everyone knows I’m not that lucky.
sunzu2@thebrainbin.org
on 27 Feb 2025 18:52
Yeah, after a certain point this shit doesn't really do anything... But younger people and mentally vulnerable people are gonna be rocked by this tech lol
Fortunately, most of my family is so tech illiterate that even if a real video got out, I could just tell them it’s a deepfake and they’d probably believe me.
DoucheBagMcSwag@lemmy.dbzer0.com
on 27 Feb 2025 18:57
Nice
ThePantser@lemmy.world
on 27 Feb 2025 19:59
Ok I checked it out, why are there so many people with boobs and with 3 foot cocks? Talk about unrealistic body expectations.
MoonlightFox@lemmy.world
on 27 Feb 2025 21:22
First off, I am sex positive, pro porn, pro sex work, and don't believe sex work should be shameful; there is nothing wrong with buying intimacy from a willing seller.
That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest one.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people's work and take people's jobs.
I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can't logically argue for them being illegal without a victim.
TheGrandNagus@lemmy.world
on 27 Feb 2025 21:54
I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can't logically argue for them being illegal without a victim.
I’ve been thinking about this recently too, and I have similar feelings.
I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?
More importantly, what should it be?
It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it wasn’t?
If we're basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).
And to know that, we'd need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don't know whether it would or wouldn't be), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody's touching that with a ten-foot pole.
WeirdGoesPro@lemmy.dbzer0.com
on 27 Feb 2025 22:27
It's so much simpler than that: it can be created now, so it will be. People will use narrative twists to post it on the clearnet, just like they do with anime (she's really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to be happening.
The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we just shove everything into the more obscure parts of the internet and let it police itself.
UltraGiGaGigantic@lemmy.ml
on 28 Feb 2025 22:06
No adult conversation required, just a quick “looks like we don’t get internet privacy after all everyone.” And erosion of more civil liberties. Again.
michaelmrose@lemmy.world
on 28 Feb 2025 00:00
Let's play devil's advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.
Although you can't necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.
Bob can't claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn't distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.
A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who, even if they haven't done any harm yet, could be prosecuted to society's ultimate profit, so long as you value innocent kids more than perverts.
shalafi@lemmy.world
on 28 Feb 2025 02:10
Am I reading this right? You’re for prosecuting people who have broken no laws?
I'll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?
This sounds like some Minority Report hellscape society.
Clent@lemmy.dbzer0.com
on 28 Feb 2025 03:45
Correct. This quickly approaches thought crime.
What about an AI gen of a violent rape and murder? Shouldn't that also be illegal?
But we have movies that have portrayed that sort of thing for years, graphically. Do those then become illegal after the fact?
And we also have movies of children being victimized, so do those likewise become illegal?
We already have studies that show watching violence does not make one violent, and while some refuse to accept that, it is well-established science.
There is no reason to believe the same isn't true for watching sexual assault. There have been many, many movies that contain such scenes.
But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient but it’s possible and efficiency will increase.
The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication, and even then I'm sure workarounds would occur.
Prohibition requires society to sacrifice freedoms, and we have to decide what we're willing to sacrifice here, because as we've seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.
michaelmrose@lemmy.world
on 28 Feb 2025 05:28
Ok, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, exactly the opposite of porn. It often takes place off screen, and when rape does appear on screen, zero to no nudity co-occurs. For children it basically always happens off screen.
Simulated child abuse has been federally illegal for ~20 years in the US, and we appear to have very little trouble telling the difference between prosecuting pedos and cinema, even whilst we have struggled plenty with sexuality in general.
But ultimately the issue will become that there is no way to prevent it.
This argument works well enough for actual child porn. We certainly don’t catch it all but every prosecution takes one more pedo off the streets. The net effect is positive. We don’t catch most car thieves either and nobody suggests we legalize car theft.
michaelmrose@lemmy.world
on 28 Feb 2025 05:06
Am I reading this right? You’re for prosecuting people who have broken no laws?
No, I'm for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it's the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it's literally the present way things already are in many places.
Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?
Demand doesn't really drop when something is illegal (same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (this attracts shady money grabbers who hate regulation, don't give a shit about law enforcement, and will do illegal stuff to get money), and it means you have to pay a shitton of government money maintaining all the prisons.
michaelmrose@lemmy.world
on 28 Feb 2025 18:10
Basically every pedo in prison is one who isn't abusing kids. Every pedo on a list is one who won't be left alone with a young family member. Reducing AI CP by itself doesn't actually do anything.
AwesomeLowlander@sh.itjust.works
on 28 Feb 2025 23:43
Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies that show, that dealing with Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.
michaelmrose@lemmy.world
on 01 Mar 2025 09:22
People are locked up all the time for just possessing child porn without having abused anyone. This isn’t a bad thing because they are a danger to society.
AwesomeLowlander@sh.itjust.works
on 01 Mar 2025 11:14
No, they are not locked up because they’re a danger to society. They’re locked up because possessing CP is indirectly contributing to the abuse of the child involved.
MoonlightFox@lemmy.world
on 28 Feb 2025 02:26
Good arguments. I think I am convinced that both cases should be illegal.
If the pictures are real, they probably increase demand, which is harmful. If the person knew they were real, then the act should result in jail and forced therapy.
If the pictures are not real, forced therapy is probably the best option.
So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is "victimless". If professionals don't find them plausibly rehabilitated, then jail time for them.
I would assume (but don't really know) that most pedophiles don't truly want to act on it, don't want to have those urges, and would voluntarily go to therapy.
Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help. There were posters and ads for it in a lot of places a year back or something. I have not researched how it works in practice, though.
Edit: I don't think the therapy we have in Norway is conversion therapy. It's about minimizing risk and helping deal with the underlying causes (medication, childhood trauma, etc.). I am not necessarily convinced that conversion therapy works.
foggenbooty@lemmy.world
on 28 Feb 2025 17:47
Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.
Let me start by saying I am not trying to state there is a 1:1 equivalence, this is just a comparison: we have long abandoned conversion therapy for homosexuals because we've found these preferences are core to them and not easily overwritten. The same is true for me as a straight person; I don't think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.
The question is, if AI can produce pornography that can satisfy the urges of someone with pedophilia without harming any minors, is that a net positive? Remember the attraction is not the crime, it’s the actions that harm others that are. Therapy should always be on the table.
This is a tricky subject, because we don't want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It's very hard to "stand up" for pedophilia, because if acted upon it has monstrous effects, but AI is making us open a can of worms that I don't believe we ever really thought through beyond criminalizing and demonizing (which could be argued was the correct approach with the technology at the time).
MoonlightFox@lemmy.world
on 28 Feb 2025 18:02
I really don't know enough about the subject or how that therapy works. I doubt that it is conversion therapy, but I really don't know. I would assume it's about handling childhood trauma, medication, etc.
Both therapy and whether satisfying urges through AI-generated content works are questions that should be answered scientifically. If there is research, then that should be the basis for the decisions taken; if there is a lack of research, then more research should be the next step.
ubergeek@lemmy.today
on 28 Feb 2025 20:05
That said, sexuality and attraction are complicated.
There’s nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
foggenbooty@lemmy.world
on 28 Feb 2025 23:01
Nobody here has at all suggested it’s ok to rape kids. I hope you can understand the difference between thinking something and doing something.
spireghost@lemmy.zip
on 28 Feb 2025 18:06
If the pictures are not real, forced therapy is probably the best option.
This is true, but it really depends on how "therapy" is defined. "Forced therapy" could mean anything, up to and including something like the old conversion-therapy approach used on gay people.
You might argue that these days we wouldn't do anything so barbaric, but considering that a pedophile's attraction is seen as deeply unsavory, and that it will never be acceptable, unlike homosexuality, people would be far more willing to abuse or exploit said "therapy".
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:55
what is the law’s position on AI-generated child porn?
the simplest possible explanation here is that any porn created based on images of children is de facto illegal. If it's trained on adults explicitly, and you prompt it for child porn, that's a grey area, probably going to follow the precedent for drawn art rather than real content.
Aggravationstation@feddit.uk
on 28 Feb 2025 06:51
what is the law’s position on AI-generated child porn?
Illegal in most of the West already, as creating sexual assault material of minors is already illegal regardless of method.
General_Effort@lemmy.world
on 28 Feb 2025 12:40
what is the law’s position on AI-generated child porn?
Pretend underage porn is illegal in the EU and some other countries. I believe that in the US it is protected by the first amendment.
Mind that when people talk about child porn or CSAM that means anything underage, as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There were some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.
ubergeek@lemmy.today
on 28 Feb 2025 20:01
I believe, in the US it is protected by the first amendment.
CSAM, artificial or not, is illegal in the United States.
General_Effort@lemmy.world
on 01 Mar 2025 11:28
I see. I’ve looked up the details. Obscenity - whatever that means - is not protected by the first amendment. So where the material is obscene, it is still illegal.
I think the concern is that although it’s victimless, if it’s legal it could… Normalise (within certain circles) the practice. This might make the users more confident to do something that does create a victim.
Additionally, how do you tell if it's real or generated? If AI gets better, how will you tell?
surewhynotlem@lemmy.world
on 28 Feb 2025 01:06
without a victim
It was trained on something.
MoonlightFox@lemmy.world
on 28 Feb 2025 01:26
It can generate combinations of things that it is not trained on, so not necessarily a victim. But of course there might be something in there, I won’t deny that.
However, the act of generating something does not create a new victim, unless someone's likeness is used and it is shared? Or is there something ethical here that I am missing?
(Yes, all current AI is basically collective piracy of everyone's IP, but that's beside the point.)
surewhynotlem@lemmy.world
on 28 Feb 2025 02:26
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
So take that video and modify it a bit. Color correct or something. That’s still abuse, right?
So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
I can’t make that call. And because I can’t make that call, I can’t support the concept.
KairuByte@lemmy.dbzer0.com
on 28 Feb 2025 03:15
I mean, there’s another side to this.
Assume you have exacting control of training data. You give it consensual sexual play, including rough play, bdsm play, and cnc play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:54
Is the output a grey area, even if it seems like real rape?
on a base semantic and mechanic level, no, not at all. They aren’t real people, there aren’t any victims involved, and there aren’t any perpetrators. You might even be able to argue the opposite, that this is actually a net positive, because it prevents people from consuming real abuse.
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
until you can publicly display your own or someone else's process of thought, or read people's minds, this is, definitionally, an impossible question to answer. So the default is no, because it's not possible to base it in any frame of reality.
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
assuming it depicts no real persons or identities, no, there is nothing necessarily wrong about this, in fact i would defer back to the first answer for this one.
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
this is the same as the previous question, media format makes no difference, it’s telling the same story.
When does the above cross into a problem?
most people would argue, and i think precedent would probably agree, that this would start to be a problem when explicit external influences are a part of the motivation, rather than it being an explicitly internally motivated process. There is necessarily a morality line that must be crossed for it to become more of a negative thing than a positive thing. The question is how to define that line in regards to AI.
Clent@lemmy.dbzer0.com
on 28 Feb 2025 03:55
We already allow simulated rape in tv and movies. AI simply allows a more graphical portrayal.
surewhynotlem@lemmy.world
on 28 Feb 2025 11:34
Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.
Any art is ok as long as the artist consents. Even if they’re drawing horrible things, it’s just a drawing.
Now the real question is, should we include rapes of people who have died and have no family? Because then you can’t even argue increased suffering of the victim.
But maybe this just gets solved by curation and the “don’t be a dick” rule. Because the above sounds kinda dickish.
MoonlightFox@lemmy.world
on 28 Feb 2025 03:27
I see the issue with how much of a crime is enough for it to be okay, and the gray area. I can't make that call either, but I kinda disagree with the black-and-white conclusion. I don't need something to be perfectly ethical, few things are. I do however want to act in an ethical manner, and strive to be better.
Where do you draw the line?
It sounds like you mean no AI can be used in any cases, unless all the material has been carefully vetted?
I highly doubt there isn’t illegal content in most AI models of any size by big tech.
I am not sure where I draw the line, but I do want to use AI services, just not for porn.
surewhynotlem@lemmy.world
on 28 Feb 2025 11:30
It just means I don’t use AI to create porn. I figure that’s as good as it gets.
dubyakay@lemmy.ca
on 28 Feb 2025 03:28
It’s not just AI that can create content like that though. 3d artists have been making victimless rape slop of your vidya waifu for well over a decade now.
surewhynotlem@lemmy.world
on 28 Feb 2025 11:29
Yeah, I’m ok with that.
AI doesn’t create, it modifies. You might argue that humans are the same, but I think that’d be a dismal view of human creativity. But then we’re getting weirdly philosophical.
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:47
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
is this a legal thing? I’m not familiar with the laws surrounding sexual abuse, on account of the fact that i don’t frequently sexually abuse people, but if this is an established legal precedent that’s definitely a good argument to use.
However, on a mechanical level, a recounting of an instance isn't necessarily a 1:1 retelling of that instance. A video of rape, for example, isn't abuse any more so than the act of rape within it; the nonconsensual recording and distribution of it (because it's rape) could be considered a crime of its own, same with possession. However, interacting with the video, i'm not sure, is necessarily abuse in its own right, based on semantics. The video most certainly contains abuse; the watcher of the video may or may not like that, and i'm not sure whether that should influence things, because that's an external value. Something like "X person thought about raping Y person, and got off to it" would also be abuse under the same pretense at a certain point. There is certainly some interesting nuance here.
If i watch someone murder someone else, at what point do i become an accomplice to murder, rather than an additional victim in the chain? That's the sort of litmus test this is going to require.
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere).
to be clear, this would be a statistically minimal amount of abuse; the vast majority of adult content is going to be legally produced and sanctioned, made public by the creators of those videos for the purposes of generating revenue. I guess the real question here is what percent of X is still considered "original" enough to count as the same thing.
Like we're talking probably less than 1% of all public porn, though still a significant amount, is non-consensual (we will use this as the base), and the AI is trained on this set to produce a minimally alike, or entirely distinct, image from the feature set provided. So you could theoretically create a formula to determine how far removed you are from the original content in that 1% of cases. I would imagine this is going to be a lot closer to 0 than to any significant number, unless you start including external factors, like intentionally deepfaking someone into it, for example. That would be my primary concern.
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
another important concept here is human behavior, as it's conceptually similar to the AI in question. There are clear, strict laws regarding most of these things in real life, but we aren't talking about real life. What if i had someone in my family who got raped at some point in their life, and this had happened to several other members of my family, or friends of mine, and i decided to write a book loosely based on the experiences of these individuals (it isn't necessarily going to be based on those instances, for example, however it will most certainly be influenced by them)?
There's a hugely complex, hugely messy set of questions, and answers, that need to be given about this. A lot of people are operating on a set of principles much too simple to be able to make any conclusive judgement about this sort of thing. Which is why this kind of discussion is ultimately important.
With this logic, any output of any pic gen AI is abuse… I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is the result of all the training data, as far as I understand the statistical behaviour of photo gen AI.
surewhynotlem@lemmy.world
on 28 Feb 2025 11:27
We could be sure of it if AI curated its inputs, which really isn't too much to ask.
Well, AI is by design not able to curate its own training data, but the companies training the models would in theory be able to. It is just not feasible to sanitise such a huge stack of data.
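To make the scale problem concrete, here is a minimal sketch of what curation usually looks like in practice. The `safety_classifier` here is a hypothetical stand-in for whatever filtering model a company might actually run; real pipelines differ:

```python
# Minimal sketch: filter a stream of candidate training images through a
# safety classifier, keeping only the ones scored below a threshold.
from typing import Iterable, Iterator

def safety_classifier(image_bytes: bytes) -> float:
    """Hypothetical model returning the probability an image is unsafe."""
    return 0.0  # placeholder so the sketch runs

def curate(images: Iterable[bytes], threshold: float = 0.5) -> Iterator[bytes]:
    for img in images:
        if safety_classifier(img) < threshold:  # keep only "safe" images
            yield img
```

At billions of images, even a classifier with a tiny false-negative rate lets a non-trivial amount of bad material through, which is the point being made above.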
ubergeek@lemmy.today
on 28 Feb 2025 20:09
With this logic, any output of any pic gen AI is abuse
Yes?
UltraGiGaGigantic@lemmy.ml
on 28 Feb 2025 22:28
There is no ethical consumption while living a capitalist way of life.
But to your argument: it is perfectly possible to tune capitalism using laws to make it very social.
I mean, every "actually existing communist country" is at its core still a capitalist system; how would you argue against that?
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:36
It can generate combinations of things that it is not trained on, so not necessarily a victim. But of course there might be something in there, I won’t deny that.
the downlow of it is quite simple: if the content is public, available for anybody to consume, and copyright permits it (i don't see why it shouldn't in most cases, although if you make porn for money you probably hold exclusive rights to it, and you probably have a decent position to begin from, though a lengthy uphill battle nonetheless), there's not really an argument against that. The biggest problem is identity theft and impersonation, more so than stealing work.
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:34
yeah bro wait until you discover where neural networks got that idea from
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 03:34
i have no problem with ai porn assuming it's not based on any real identities; i think that should be considered identity theft or impersonation or something.
Outside of that, it's more complicated, but i don't think it's a net negative. People will still thrive in the porn industry; it's been around for as long as it's been possible, i don't see why it wouldn't continue.
drmoose@lemmy.world
on 28 Feb 2025 03:57
Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 04:02
revenge porn, simple as. Creating fake revenge porn of real people is still, to some degree, revenge porn, and i would argue stealing someone's identity/impersonation.
To be clear, your example is a sketch of Johnny Depp; i'm talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.
KillingTimeItself@lemmy.dbzer0.com
on 28 Feb 2025 04:07
sort of. There are arguments that private ownership of these videos is also weird and shitty, however i think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise i can see issues cropping up.
Other people do not have any inherent rights to your likeness; you should not simply be able to pretend to be someone else. That's considered identity theft/fraud when we do it with legally identifying papers; it's a similar case here, i think.
But the thing is, it's not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it's not inherently harmful, or at least it's not obvious how it would be.
KillingTimeItself@lemmy.dbzer0.com
on 04 Mar 2025 01:10
the only perceivable reason to create these videos is either for private consumption, in which case, who gives a fuck. Or for public distribution, otherwise you wouldn’t create them. And you’d have to be a bit of a weird breed to create AI porn of specific people for private consumption.
If AI isn’t involved, the same general principles would apply, except it might include more people now.
I've been thinking about this more, and I think one interesting argument here is "toxic culture growth". As in, even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposures (like social media or forum discussions), even without the direct sharing.
I think this is slippery to the point of government mind control but maybe there’s something valuable to research here either way.
KillingTimeItself@lemmy.dbzer0.com
on 05 Mar 2025 01:22
I've been thinking about this more, and I think one interesting argument here is "toxic culture growth". As in, even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposures (like social media or forum discussions), even without the direct sharing.
this is another big potential as well. Does it perpetuate cultural behaviors that you want to see in society at large? Similar things like this have resulted from misogyny and the relevant history of feminism.
It’s a whole thing.
I think this is slippery to the point of government mind control but maybe there’s something valuable to research here either way.
i think there is probably a level of government regulation that is productive, i’m just curious about how we even define where that line starts and ends, because there is not really an explicitly clear point, unless you have solid external inferences to start from.
Honestly, I'm quite happy with the "social justice warrior" approach. Sure, it's flawed and weak to manipulation for now, but as a strategy for society to self-correct it's quite brilliant.
I'm optimistic society itself should be able to correct for this issue as well, though considering the current climate the correction might be very chaotic.
KillingTimeItself@lemmy.dbzer0.com
on 07 Mar 2025 01:04
i mean, i’m not sure modern social justice is working as intended given the political landscape, but historically small communities do manage to self regulate very effectively, that one is for sure. I will give you that.
The only effective way to mandate something at a societal level is going to be laws, i.e. government, otherwise you’re going to have an extremely disjointed and culturally diverse society, which isn’t necessarily a bad thing.
iAvicenna@lemmy.world
on 28 Feb 2025 13:07
I guess the point is that this enables the mass production of revenge porn at essentially a person-on-the-street level, which makes it much harder to punish and prevent distribution. When relatively few sources produce the unwanted product, punishing only the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to try to regulate the production method. It is all a matter of where the most efficient position for the bottleneck is.
For instance, when 3D printing allows people to produce automatic rifles in their homes, saying "civil use of automatic rifles is illegal, so that is fine" is useless.
I think that's a fair point, and I wonder how this will affect freedom of expression on the internet. If you can't find the distributor, it'll be really tough to get a handle on this.
On the other hand, the sheer overabundance could simply break the entire value of revenge porn, as in "nothing is real anyway, so it doesn't matter", which I hope would be the case. No one will be watching revenge porn because they can generate any porn they want in a heartbeat. That's the ideal scenario, anyway.
iAvicenna@lemmy.world
on 28 Feb 2025 16:48
It is indeed a complicated problem with many intertwined variables; I wouldn't wanna be in the shoes of policy makers (assuming that they are actually searching for an honest solution and not trying to turn this into profit, lol).
For instance, too much regulation in fields like this would essentially kill high-quality open-source AI tools and make most of them proprietary software, leaving the field at the mercy of tech monopolies. This is probably what these monopolies want, and they will surely try to push things this way to kill competition (talk about capitalism spurring competition and innovation!). They might even don the cloak of some of these bad actors to speed up the process. Given the possible application range of AI, this is probably even more dangerous than flooding the internet with revenge porn.
100% freedom with no regulations will essentially lead to a mixed situation of creative and possibly groundbreaking uses of the tech vs. many bad actors using the tech for things like scamming, disinformation, etc. How it will balance out in the long run is probably very hard to predict.
I think two things are clear: 1) both extremes are not ideal, and 2) between the two extremes, 100% freedom is still the better option (the former just exchanges many small bad actors for a couple of giant bad actors and chokes any possible good outcomes).
Based on this, starting with a solution closer to the "freedom edge" and improving it step by step based on results is probably the most sensible approach.
iAvicenna@lemmy.world
on 28 Feb 2025 11:40
ubergeek@lemmy.today
on 28 Feb 2025 20:07
i have no problem with ai porn assuming it’s not based on any real identities
With any model in use, currently, that is impossible to meet. All models are trained on real images.
KillingTimeItself@lemmy.dbzer0.com
on 01 Mar 2025 01:38
With any model in use, currently, that is impossible to meet. All models are trained on real images.
yes, but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person closely enough to perceptibly be them?
You are literally using the schizo argument right now: "if an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, therefore he must be stealing their likeness."
ubergeek@lemmy.today
on 01 Mar 2025 13:02
No, the problem is a lack of consent of the person being used.
And now, being used to generate depictions of rape and CSAM.
KillingTimeItself@lemmy.dbzer0.com
on 04 Mar 2025 00:47
yeah but like, legally, is this even a valid argument? Sure, there is technically probably like 0.0001% of the average person in any given result of an AI generated image. I don't think that gives anyone explicit rights to that portion, however.
That’s like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.
You can argue about consent all you want, but at the end of the day, if you're posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably you're implicitly consenting to other people being able to use those images (because you can't stop people from doing that, except through copyright, and that's not very strict in most cases).
And now, being used to generate depictions of rape and CSAM.
i don't see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim, otherwise it's no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But i don't know of any laws that prevent you from doing that, unless it's explicitly to do with something like blackmail, extortion, or harassment.
The fundamental problem here is that you're in an extremely uphill position to even begin the argument of "well, it's trained on people, so therefore it uses the likeness of those people".
Does a facial structure recognition model use the likeness of other people? Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to begin to break down at what point that person's likeness begins, and at what point it ends. It's simply an impossible task.
ubergeek@lemmy.today
on 04 Mar 2025 13:58
yeah but like, legally, is this even a valid argument?
Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.
Morally, that's what you're doing when you use AI to generate CSAM. It's the same idea behind why we ban all pre-created CSAM as well: because you are victimizing the person every single time.
i don't see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim,
It makes them a victim.
But i don’t know of any laws that prevent you from doing that, unless it’s explicitly to do with something like blackmail, extortion, or harassment.
The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.
Does a facial structure recognition model use the likeness of other people?
Yes.
Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to begin to break down at what point that person's likeness begins, and at what point it ends. It's simply an impossible task.
Exactly. So, without consent, it shouldn’t be used. Periodt.
KillingTimeItself@lemmy.dbzer0.com
on 05 Mar 2025 01:18
Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.
if you have schizophrenia, sure. Legal is what the law defines as ok. Whether or not people get charged for it is another thing. The question is: "do you have the legal right to do it or not?"
Morally, that's what you're doing when you use AI to generate CSAM. It's the same idea behind why we ban all pre-created CSAM as well: because you are victimizing the person every single time.
legally, the reasoning behind this is that it's just extremely illegal; there are almost no cases, if not zero, where it would be ok or reasonable, and therefore the moral framework tends to be developed around that. I don't necessarily agree with it always being victimization, because there are select instances where it just doesn't really make sense to consider it that; there are acts you commit that would make it victimization. However, i like to subscribe to the philosophy that it is "abusive" material, and therefore innately wrong. Like blackmail: i find that to be a little bit more strict and conducive to that sort of definition.
It makes them a victim.
at one point in time, yes; perpetually, in some capacity, they will exist as having been a victim, or having been victimized at one point. I also don't really consider it healthy or productive to engage in a "once a victim, always a victim" mentality, because i think it sets a questionable precedent for mental health care. Semantically, someone who was a victim once is still a victim of that specific event, however it's a temporally relevant victimization; i just think people have gotten a little loose with the usage of that word recently.
I'm still not sure how it makes that person a victim, unless it meets one of the criteria i laid out, in which case it very explicitly becomes an abusive work. Otherwise it's questionable how you would even attribute victimization to the victim in question, because there is no explicit victim to consider. I guess you could consider everybody even remotely tangentially relevant to be a victim, but that opens a massive black hole of logical reasoning which can't trivially be closed.
To propose a hypothetical here: let's say there is a person who we will call Bob. Bob has created a depiction of "abuse" in such a horrendous manner that even laying your eyes upon the work will forever ruin you. We will define the work in question to be a piece of art, depicting no person in particular, arguably barely resembling a person at all; the specific definition is left to the reader. You could hypothetically, in this instance, argue that even viewing the work is capable of making people a "victim" of it, however you want to work that one out.
The problem here is that Bob hasn't created this work in complete isolation, because he's just a person: he interacts with people, has a family, has friends and acquaintances; he's a normal person, aside from the horrors beyond human comprehension he has created. Therefore, in some capacity, the influence of these people on his life has to have influenced the work he engaged in on that piece. Are the people who know/knew Bob victims of this work as well, regardless of whether or not they have seen it? Does the very act of being socially related to Bob make them a victim of the work? For the purposes of the hypothetical we'll assume they haven't seen the work, and that he has only shown it to people he doesn't personally know.
I would argue, and i think most people would agree with me, that there is no explicit tie between the work that Bob has created and the people he knows personally. Therefore, it would be a stretch to argue that those people, because they were tangentially relevant to Bob, are now victims, even though they have not been influenced by it. Could it influence them in some external way, possibly causing some sort of external reaction? Yeah, that's a different story. We're not worried about that.
This is essentially the problem we have with AI, there is no explicit resemblance to any given person (unless defined, which i have already explicitly opted out of) or it has inherently based the image off of via training (which i have also somewhat, explicitly opted out of as well) there are two fundamental problems here that need to be answered. First of all, how are these people being victimized? By posting images publicly on the internet? Seems like they consented to people at least being aware of them, if not to some degree manipulating images of them, because there is nothing to stop that from happening (as everyone already knows from NFTs) And second of all, how are we defining these victims? What’s the mechanism we use to determine the identity of these people, otherw
ubergeek@lemmy.today
on 05 Mar 2025 11:38
I refuse to debate ideas on how to make ethical CSAM with you.
Go find a pedo to figure out how you want to try to make CSAM, and you can well-akshully all you want.
KillingTimeItself@lemmy.dbzer0.com
on 07 Mar 2025 01:01
all of my arguments have explicitly removed any form of anything closely resembling CSAM to the point of being illegal under existing law, or at the very least, extremely questionable.
The only thing i haven't excluded is the potential to use an AI trained explicitly on humans, with no children in the dataset, to generate porn of someone "under the age of 18", which it has zero basis in reality for and cannot functionally do. That would be the only actual argument i can think of where it wouldn't already be illegal, or at least extremely questionable. Everything else i've provided a sufficient exclusion for.
Have fun calling me a pedo for no reason though, i guess.
UltraGiGaGigantic@lemmy.ml
on 28 Feb 2025 22:07
thispersondoesnotexist.com
Refresh for a new fake person
KillingTimeItself@lemmy.dbzer0.com
on 01 Mar 2025 01:35
this one's a classic.
frezik@midwest.social
on 28 Feb 2025 12:56
I've found that a lot of things on the Internet went wrong because they were ad-supported for "free". Porn is one of them.
There is ethically produced porn out there, but you’re going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it’s mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.
Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.
froggycar360@slrpnk.net
on 28 Feb 2025 22:13
What's illegal in real porn should be illegal in AI porn, since eventually we won't know whether it's AI.
jacksilver@lemmy.world
on 01 Mar 2025 00:36
That’s the same as saying we shouldn’t be able to make videos with murder in them because there is no way to tell if they’re real or not.
froggycar360@slrpnk.net
on 02 Mar 2025 01:56
That's a good point, but there's much less of a market for a murder video industry.
jacksilver@lemmy.world
on 02 Mar 2025 02:35
I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.
But I get the feeling you're saying that there isn't a huge market for showing real people dying realistically without their permission. But that's more of a technicality. The question is whether the content, or the production of the content, is illegal. If it's not a real person, who is the victim of the crime?
froggycar360@slrpnk.net
on 02 Mar 2025 04:53
Yeah, the latter. Also, murder in films is for the most part for storytelling. It's not murder simulations for serial killers to get off to, you know what I mean?
AI media models have to be trained on real media. Illegal content would mean illegal media, and benefiting from, supporting, and profiting from crimes and their victims.
The lengths pedophiles will go to, and the fallacies they will use, to justify themselves are absurd.
MoonlightFox@lemmy.world
on 01 Mar 2025 14:08
Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thoughts and opinions at all. I am not in any way interested in this kind of content.
I encourage you to read my other posts in the different threads here and see. I am not an apologist, and do not condone it either.
I do genuinely believe AI can generate content it is not trained on, that’s why I claimed it can generate illegal content without a victim. Because it can combine stuff from things it is trained on and end up with something original.
I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.
You made me wish I didn’t…
yyprum@lemmy.dbzer0.com
on 01 Mar 2025 15:13
Don’t pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.
You are right that it is a victimless crime (as far as the creation of content goes). I could create porn with minions without using real minion porn, to use the most random example I could think of. There's the whole defamation thing of publishing content without someone's permission, but that I feel is a discussion separate from AI (we could already create nasty images of someone before AI; AI just makes it easier). But using such content for personal use… it is victimless. I have a hard time arguing against it.

Would the availability of AI-created content with unethical themes allow people to get that out of their system without creating victims? Would it make the far riskier and more horrible business of creating illegal content with real, unwilling people disappear, or at the very least become much more uncommon? Or would it make people more willing to consume the content, creating a feeling of false safety towards content that was previously illegal? There are a lot of implications that we should really be thinking about, and how they would affect society, for better or worse…
MoonlightFox@lemmy.world
on 01 Mar 2025 15:26
Don’t pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works
Thank you 😊
Grandwolf319@sh.itjust.works
on 27 Feb 2025 22:58
It's times like this that I think, ahhh, the Internet never truly changed :D
LMurch@thelemmy.club
on 27 Feb 2025 23:10
This is the way.
Krik@lemmy.dbzer0.com
on 27 Feb 2025 23:22
Krik@lemmy.dbzer0.com
on 28 Feb 2025 00:27
😂
billwashere@lemmy.world
on 28 Feb 2025 03:15
I feel like this should be a real tattoo….
Aggravationstation@feddit.uk
on 27 Feb 2025 23:42
The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far off year of 2025 appears.
Me: “You know, in the future, you’ll make your own porn videos.”
90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”
Me: “Ha! No. So Nvidia will release this system called CUDA…”
codexarcanum@lemmy.dbzer0.com
on 28 Feb 2025 01:13
I thought this was going to go Watchmen for a moment. Like…
It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen cinemax.
It is 2007, i am in my dorm, i am jerking off to a forum thread full of hi-res porno.
It is 2027, i am jerking off to an ai porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.
ThePantser@lemmy.world
on 28 Feb 2025 03:34
Oops, the stream hallucinated and mutated into a horror show of women with nipples that are mouths and dicks with eyeballs.
Iheartcheese@lemmy.world
on 28 Feb 2025 03:46
I am about to nut so hard that it shatters my perception of time.
100_kg_90_de_belin@feddit.it
on 28 Feb 2025 04:48
I am about to nut so hard that it shatters my perception of time.
Did I stutter?
MaggiWuerze@feddit.org
on 28 Feb 2025 11:00
Go on…
TeamAssimilation@infosec.pub
on 28 Feb 2025 12:05
“Nothing else excites me anymore” - 2027 you, probably
I cannot for the life of me find it, but there was an article, before the modern explosion of ML into the public consciousness, where a guy found that one of the big search engines offered an API to "rate" NSFW images on a numeric scale. The idea was that site admins could use it to automatically filter content with a bit more granularity.
So naturally, he trained a naive image-generating model with the NSFW score as the sole metric of success, and got results much like you describe. The results were pretty much "what if Dalí and Cronenberg had a baby with Slaanesh?"
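For anyone curious what "sole metric of success" means mechanically, here's a minimal sketch of the idea: a blind hill-climb that mutates an image and keeps any mutation the scorer likes better. The `nsfw_score` function is a dummy stand-in for the real rating API from the story:

```python
# Minimal sketch: optimize an image against an external score with random
# hill-climbing. With no realism constraint, the optimizer drifts toward
# whatever degenerate patterns happen to max out the scorer.
import numpy as np

rng = np.random.default_rng(0)

def nsfw_score(image: np.ndarray) -> float:
    # Dummy stand-in for the API call in the story, which returned a
    # numeric "explicitness" rating for an image.
    return float(image.mean())

image = rng.random((64, 64, 3))  # start from pure noise
best = nsfw_score(image)

for _ in range(1000):
    candidate = np.clip(image + rng.normal(0, 0.05, image.shape), 0, 1)
    score = nsfw_score(candidate)
    if score > best:  # keep only mutations the scorer rates higher
        image, best = candidate, score
```

Optimizing purely for one judge's score, with nothing anchoring the image to reality, is exactly how you get the nightmare collages described above.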
kilgore_trout@feddit.it
on 28 Feb 2025 10:04
Where were you in 2017?
embed_me@programming.dev
on 28 Feb 2025 13:45
Went to college
codexarcanum@lemmy.dbzer0.com
on 28 Feb 2025 13:54
I figured rule of threes meant it was funnier to leave it out. 2017 would have been sad gooning to pornhub during the first trump nightmare.
Then 2027 could be sad gooning to ai hyperporn during the second trump nightmare.
Maybe I should have used 20-year jumps, but "2037, I am jerking off because there's no food, and the internet is nothing but ai porn" didn't seem as funny a point for the "time shattering" bit.
Jimmycrackcrack@lemmy.ml
on 02 Mar 2025 02:02
Look it was a long thread okay.
Zron@lemmy.world
on 28 Feb 2025 17:30
The AI camera will still zoom in on the guy's nuts as you're about to bust tho.
irish_link@lemmy.world
on 28 Feb 2025 04:27
pop is funny
werefreeatlast@lemmy.world
on 28 Feb 2025 05:19
I’m a representative of the American pornographical societies of America and I would like to educate myself and others in this disgusting AI de-generated graphical content for the purpose of informing my friends and family about this. Could we please have a link?
Sparkega@sh.itjust.works
on 28 Feb 2025 07:01
collapse
shades@lemmy.dbzer0.com
on 28 Feb 2025 03:52
nextcollapse
*Aliboobie
drmoose@lemmy.world
on 28 Feb 2025 03:56
nextcollapse
Good.
Hot take but I think AI porn will be revolutionary, and mostly in a good way. The sex industry is an extremely wasteful and inefficient use of our collective time that also often involves a lot of abuse and dark business practices. It’s just somehow taboo to even mention this.
JackFrostNCola@lemmy.world
on 28 Feb 2025 04:13
nextcollapse
Sometimes you come across a video and you are like ‘oh this. I need more of THIS.’
And then you start tailoring searches to try to find more of the same, but you keep getting generic or repeated results because of the lack of well-described/defined content and the overuse of video tags (obviously casting a wide net for more views rather than being specific).
But would i watch AI content if i could feed it a description of what i want? Hell yeah!
I mean there are only so many videos of girls giving a blowjob while he eats tacos and watches old Black & White documentaries about the advancements of mechanical production processes.
dumbass@leminal.space
on 28 Feb 2025 05:06
nextcollapse
I hate it when you find a good video and it does the job really well, so a few days later you’re like, yeah, let’s watch that one again. You type in every single description you can about the video, and you find everything but the video you want, and none of it is anywhere near as good.
Hell I’d take an AI porn search engine for now, let me describe in detail what the video I’m looking for is so I can finally see it again.
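(The search-engine half of that wish is basically already a solved problem: embed free-text descriptions of each video and rank them against the query by cosine similarity. A minimal sketch with sentence-transformers; the model name and the toy catalogue are just illustrative assumptions.)

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical catalogue: one free-text description per video id.
catalogue = {
    "vid_001": "two people on a couch, handheld camera, dim lighting",
    "vid_002": "outdoor scene, bright daylight, picnic blanket",
}

ids = list(catalogue)
# Pre-compute embeddings once for the whole catalogue.
corpus = model.encode([catalogue[i] for i in ids], convert_to_tensor=True)

query = model.encode("dim living room, filmed on a shaky phone",
                     convert_to_tensor=True)
scores = util.cos_sim(query, corpus)[0]
ranked = sorted(zip(ids, scores.tolist()), key=lambda x: -x[1])
print(ranked)  # best-matching video ids first
```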
ghostfish@lemm.ee
on 28 Feb 2025 11:13
nextcollapse
Always, always download your favourite videos. Today they’re available, tomorrow you don’t know. (Remember the great PornHub purge? Pepperidge Farm remembers.)
Appoxo@lemmy.dbzer0.com
on 28 Feb 2025 12:33
collapse
Not to mention the users that may have a specific interest in some topic/action where basically all types of potential sources are locked behind paywalls.
eugenevdebs@lemmy.dbzer0.com
on 28 Feb 2025 22:30
collapse
Can confirm with my fetish. Some great artists and live actors do it, but 90% of the content for it online is bad MS Paint-level edits and horrid acting with props. That 10%? God tier. The community showers them in praise and commissions, and they only stop when they want to, unless a payment service like Visa or Patreon censors them and their livelihood as consenting adults.
If only there was a p2p way to send people funds without anyone knowing the sender or the recipient…
ZILtoid1991@lemmy.world
on 28 Feb 2025 12:27
nextcollapse
Yes, but it will also hurt the very workers’ bottom line, and with some clever workarounds it’ll be used to fabricate defamatory material that other methods were not good at.
SoftestSapphic@lemmy.world
on 28 Feb 2025 17:01
collapse
Good, the sooner this system fails to provide for more people, the sooner we form mobs and rid ourselves of it.
The human race needs to take its next evolutionary step: a post-scarcity approach to world economies.
ZILtoid1991@lemmy.world
on 28 Feb 2025 18:16
collapse
AI won’t bring us a post scarcity world, but one with the upmost surveillance and “art” no longer made by artists.
SoftestSapphic@lemmy.world
on 28 Feb 2025 18:23
nextcollapse
We already live in a post-scarcity world; we produce more than enough goods for every person alive. We just throw away more than half of all food and clothing produced instead of giving it to the hungry, because it isn’t profitable.
scottywh@lemmy.world
on 28 Feb 2025 19:41
collapse
*utmost
Regrettable_incident@lemmy.world
on 28 Feb 2025 18:38
collapse
I can see it being used to make images of people who don’t know and haven’t consented though, and that’s pretty shit.
lance20000@lemmy.ca
on 28 Feb 2025 20:31
nextcollapse
And clearly will be used for CP.
brucethemoose@lemmy.world
on 28 Feb 2025 22:54
collapse
Filters on online platforms are surprisingly effective, though. CivitAI is begging for it, and they’ve kept it out.
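(The general shape of those platform-side filters is just an image classifier sitting in front of the upload endpoint. A minimal sketch below; the model name is an assumption, any off-the-shelf content classifier on the Hub slots in here, and real platforms layer several models plus human review.)

```python
from transformers import pipeline

# Assumed off-the-shelf classifier; swap in whatever the platform needs.
classifier = pipeline("image-classification",
                      model="Falconsai/nsfw_image_detection")

def allow_upload(image_path: str, threshold: float = 0.8) -> bool:
    # Returns e.g. [{"label": "nsfw", "score": 0.97}, {"label": "normal", ...}]
    results = classifier(image_path)
    flagged = {r["label"]: r["score"] for r in results}
    return flagged.get("nsfw", 0.0) < threshold

print(allow_upload("upload.png"))
```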
UltraGiGaGigantic@lemmy.ml
on 28 Feb 2025 21:01
collapse
Same with Photoshop. I wish people didn’t do bad things with tools too, but we can’t let the worst of us define the world we live in.
coolmojo@lemmy.world
on 28 Feb 2025 23:54
collapse
Or cutting out faces with scissors and gluing them onto Playboy. Scissors and glue are just tools.
SabinStargem@lemmings.world
on 28 Feb 2025 04:38
nextcollapse
I am looking forward to this becoming common and effective. Being able to generate animated hentai in assorted styles would be neat. Lina Inverse getting boinked by Donald Duck could be a thing.
brucethemoose@lemmy.world
on 28 Feb 2025 23:01
collapse
Lina Inverse getting boinked by Donald Duck could be a thing.
You can 100% already do this, with high quality, locally. Generate an image with PonyXL (plus LoRAs if it doesn’t get the characters quite perfect), then do img2vid with Hunyuan or this.
The catch is that it’s all unreasonably user-unfriendly to set up, and most online UIs are remarkably shit. Basically AI land just needs a few more “integrators” to plug this stuff into each other and package it sanely (as opposed to researchers just throwing papers out or “entrepreneurs” making some shitty closed cloud UI), plus some better hardware support, and you’re already there with no new models/tech.
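(For a flavour of what that two-step pipeline looks like glued together in code, here’s a minimal diffusers sketch: SDXL-family txt2img with a character LoRA, then img2vid. Stable Video Diffusion stands in for the video step since Hunyuan support depends on your diffusers version; the LoRA path and prompt are placeholders, and you’ll want a beefy GPU.)

```python
import torch
from diffusers import StableDiffusionXLPipeline, StableVideoDiffusionPipeline
from diffusers.utils import export_to_video

# Step 1: text-to-image with an SDXL-family checkpoint + character LoRA.
txt2img = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
txt2img.load_lora_weights("path/to/character_lora.safetensors")  # placeholder
image = txt2img("character portrait, detailed anime style").images[0]

# Step 2: animate the still with an img2vid model (SVD as a stand-in).
img2vid = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt", torch_dtype=torch.float16
).to("cuda")
frames = img2vid(image.resize((1024, 576))).frames[0]
export_to_video(frames, "clip.mp4", fps=7)
```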
FreshLight@sh.itjust.works
on 28 Feb 2025 05:35
nextcollapse
Kudos to the English-speaking guy on the team who pushed that name under the radar. That has got to be intentional.
hmmm@sh.itjust.works
on 28 Feb 2025 08:05
nextcollapse
Is that JINX?
Rubanski@lemm.ee
on 28 Feb 2025 11:43
nextcollapse
Ye
Arda1@lemmy.world
on 28 Feb 2025 11:49
nextcollapse
I don’t like that I was able to tell.
whalebiologist@lemmy.world
on 28 Feb 2025 13:12
nextcollapse
fucking saw it too 😅
Zetta@mander.xyz
on 01 Mar 2025 13:40
nextcollapse
Yes, here’s the link to the uncensored clip, NSFW obviously, but you’ll need to put in an email to turn off NSFW filtering. civitai.com/images/60144368
The censored images in the thumbnail were all pulled from videos uploaded to this LoRA on Civitai called “Better Titfuck (WAN and HunYuan)”.
Jimmycrackcrack@lemmy.ml
on 02 Mar 2025 02:07
collapse
I didn’t know who that was before looking it up, and I totally thought you were talking about the Pokemon lol
My friend wants to know if it can generate VR videos too?
Lucky_777@lemmy.world
on 28 Feb 2025 13:27
nextcollapse
Asking the real questions
wabafee@lemmy.world
on 28 Feb 2025 16:53
nextcollapse
Tell your friend that my friend is also asking too.
brucethemoose@lemmy.world
on 28 Feb 2025 22:52
nextcollapse
Theoretically? Maybe.
There are already networks to generate depth maps out of 2D video, so hooking up the output to that and a simpler VR video encoder should probably work.
Will it be jank as hell? Oh yeah. Nothing is conveniently packaged in Local AI Land, unfortunately.
randon31415@lemmy.world
on 28 Feb 2025 23:37
collapse
Take anything created, ffmpeg it into frames, then run it through this script’s stereoscopic mode:
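(The linked script isn’t shown here, but a minimal homebrew version of the same idea goes like this: ffmpeg splits the clip into frames, a monocular depth model guesses per-pixel depth, and a naive horizontal pixel shift builds a left/right pair. File names and the 12-pixel max disparity are assumptions, and real stereoscopic scripts do much smarter hole-filling.)

```python
import subprocess
from pathlib import Path

import numpy as np
import torch
from PIL import Image

# 1) ffmpeg the clip into individual frames.
Path("frames").mkdir(exist_ok=True)
subprocess.run(["ffmpeg", "-i", "clip.mp4", "frames/%05d.png"], check=True)

# 2) Estimate depth for one frame with MiDaS (small model via torch.hub).
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

frame = np.array(Image.open("frames/00001.png").convert("RGB"))
with torch.no_grad():
    pred = midas(transform(frame))
depth = torch.nn.functional.interpolate(
    pred.unsqueeze(1), size=frame.shape[:2], mode="bicubic"
).squeeze().numpy()

# 3) Naive stereo: nearer pixels (larger inverse depth) shift more.
h, w, _ = frame.shape
disparity = (depth / depth.max() * 12).astype(int)  # assumed 12 px max shift
rows = np.arange(h)[:, None]
cols = np.arange(w)[None, :]
left = frame[rows, np.clip(cols + disparity, 0, w - 1)]
right = frame[rows, np.clip(cols - disparity, 0, w - 1)]
Image.fromarray(np.hstack([left, right])).save("sbs_00001.png")  # side-by-side
```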
iAvicenna@lemmy.world
on 28 Feb 2025 11:37
nextcollapse
this is gonna take rule 34 to a whole other level
technohippie@slrpnk.net
on 28 Feb 2025 12:42
nextcollapse
Even Onlyfans girls and boys will lose their jobs because of AI :(
oldmansbeard@midwest.social
on 28 Feb 2025 13:21
nextcollapse
This is going to destroy what’s left of the social fabric
Regrettable_incident@lemmy.world
on 28 Feb 2025 18:37
collapse
You mean deepfakes? Yeah, it’s really not a great development.
burgerpocalyse@lemmy.world
on 28 Feb 2025 16:54
nextcollapse
ai porn is dull. i’ll take poorly drawn heinous fetish art over your ai generated slop.
Regrettable_incident@lemmy.world
on 28 Feb 2025 18:35
nextcollapse
Yeah, but I’m sure it’ll improve in time. I can’t really see the point of AI porn though, unless it’s to make images of someone who doesn’t consent, which is shit. For everything else, regular porn has it covered and looks better too.
You want me to find a citation explaining why women selling their bodies is bad?
bigb@lemmy.world
on 28 Feb 2025 21:19
nextcollapse
Somehow you’re both partially wrong. There are people who have been badly abused by the porn industry. There’s always a need to make sure people are safe. But there’s nothing wrong if someone wants to willingly perform in pornography.
But it’s mostly because of you people. You make their lives miserable with pointless moralisation. You are the reason the industry is full of shady monsters; you made it that way with your constant religious fervor.
Do you need me to repeat it using simpler language?
VintageGenious@sh.itjust.works
on 02 Mar 2025 10:36
collapse
Yes, explain how you turn one commenter into “you people” based on facts from your ass?
Vatowine@lemmy.world
on 28 Feb 2025 22:21
collapse
I would argue that in terms of being worn out (joints and stuff), construction is definitely harder on the body. How many regret that they can barely walk in their 50s and 60s?
Why are you assuming that it is? Maybe it’s because I’m not a religious person, but I don’t see anything morally wrong with sex work. Whether someone is doing it against their will is a separate issue, but that’s not an assumption I’d make without other evidence.
If you really are coming at this issue from a religious point of view, then there’s no point getting into a discussion here since I’m not going to change your mind on that (nor do I care to; believe what you want). Otherwise, I’m curious what your actual arguments might be.
100_kg_90_de_belin@feddit.it
on 01 Mar 2025 14:43
collapse
Patriarchy destroys women’s lives. Porn is just having sex in front of a camera.
brucethemoose@lemmy.world
on 01 Mar 2025 01:10
collapse
So, I definitely didn’t click! But having clicked on GitHub links in the past I can surmise that there’s a step by step install guide and also one for model acquisition. Just be sure not to click the link, and definitely do not follow what I assume is a very well written and easily understood step by step install guide.
brucethemoose@lemmy.world
on 01 Mar 2025 01:09
collapse
This is Lemmy. Why not self host the generation? :)
OhVenus_Baby@lemmy.ml
on 28 Feb 2025 18:50
nextcollapse
At least don’t censor the photos, so that, for educational purposes only, we can collectively judge if this AI art is indeed worth a damn.
Graphine@lemmy.world
on 28 Feb 2025 20:53
nextcollapse
Finally, my dream of making black gay porn is realized
Hossenfeffer@feddit.uk
on 28 Feb 2025 23:12
nextcollapse
If you build it, they will porn.
diemartin@sh.itjust.works
on 01 Mar 2025 15:47
collapse
Why you think the net was born? 🎶
Noodle07@lemmy.world
on 01 Mar 2025 20:57
collapse
Who are the girls in the picture? We can do this, team. Left to right, starting at the top.
Gwen Stacy
??
bayonet
little mermaid
??
??
Jinx
??
Rei
Rei
lol Rei
Aerith
Not to be that guy but… go out and touch some grass.
If you don’t want to be that guy, then stop being that guy.
You shouldn’t judge someone for having a grass fetish. They have wants and needs like anyone else.
When the lawn gets cut, which activates its distress signal with that sweet fresh fragrance… OP can’t help but get off on that.
I’ll never look at an old man with grass-stained, white New Balances ever again.
https://i.imgflip.com/3870n9.jpg
Because God forbid anyone have fun, right? You need to take your own advice.
You must be fun because you love pornstars! I don’t so I’m boring! I’m going for some fun grass now. Have a good one.
Ah yes, the famous pornstars “little mermaid” and “aerith”
I can’t disagree that they’re pretty famous. And apparently they are starring in a porn video. So…
We’re going to need to see some proof, smart guy. For uh discussions sake
Sounds like you need to touch yourself more, actually. It's okay. Everyone does it.
I usually don’t rely on pornstars, and if I do, I don’t repeat or remember their faces. But I guess there is a big market of pro-fappers on Lemmy.
They’re videogame and movie characters, Einstein.
Oof
take a chill pill
He can’t. His parents bought him a PS 1 when he was a child, or something.
Are you still talking about me? Oh man… so sad.
That’s Rem, not Rei
Who’s Rem?
If you have to ask, you can’t afford it!
Dude, I’ve got like… Five bucks. How much more do these cartoon woman want?
7 looks like Jinx from Arcane
Added
It feels good to contribute to society
#1 Gwen Stacy?
Added
I would find it hard to believe if Jessica Rabbit wasn’t somewhere there, probably 6? Although the hair style doesn’t quite match.
The world has forgotten about Mrs Rabbit, sadly.
I’d like to point out the program is named WanX…wanx…wanks…
Does the wolf need to be an adult in human years or dog years?
I understand the convention is that the wolf is really a thousand-year-old polymorphed dragon, regardless of physical appearance.
Drag is finally vindicated?
Well that is a plot twist that I was somehow not expecting from an AI porn video.
Asking the important questions…
i think that it should probably be capable of consent, that would be my guess.
I mean, I think he said it’s a pink wolf furry, so you’re probably good if he’s the one penetrating. If it was a pink furry wolf, on the other hand…
i believe that is what they said, that is a pretty good chance, but still good to be on the safe side i suppose.
Nice
Ok I checked it out, why are there so many people with boobs and with 3 foot cocks? Talk about unrealistic body expectations.
Well, genetic engineering is advancing at a good clip.
I hate how mainstream media is always underrepresenting the size of the typical cock.
I have a 3 foot cock. Well as long as they’re mouse feet.
<img alt="" src="https://lemmy.ml/pictrs/image/333417d6-080f-4442-bed1-2bc9731b9726.jpeg">
Check out this article en.wikipedia.org/wiki/Supernormal_stimulus
First off, I am sex positive, pro porn, pro sex work, and don’t believe sex work should be shameful, and that there is nothing wrong about buying intimacy from a willing seller.
That said. The current state of the industry and the conditions for many professionals raises serious ethical issues. Coercion being the biggest issue.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people’s work and take people’s jobs.
I think another major point to consider going forward is if it is problematic if people can generate all sorts of illegal stuff. If it is AI generated it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can’t logically argue for it being illegal without a victim.
I’ve been thinking about this recently too, and I have similar feelings.
I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?
More importantly, what should it be?
It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it wasn’t?
If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).
And to know that, we’d need extensive and extremely controversial studies. Beyond that, even in the event allowing this stuff to be generated is an overall positive (and I don’t know whether it would or won’t), will many politicians actually call for this stuff to be allowed? Seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten foot pole.
It’s so much simpler than that: it can be created now, so it will be. They will use narrative twists to post it on the clearnet, just like they do with anime (she’s really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to be happening.
The only question is whether or not politicians will stop mud slinging long enough to have an adult conversation, or will we just shove everything into the more obscure parts of the internet and let it police itself.
No adult conversation required, just a quick “looks like we don’t get internet privacy after all everyone.” And erosion of more civil liberties. Again.
Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.
Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.
Bob can’t claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.
A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles, who can, even if they haven’t done any harm yet, be profitably prosecuted, to society’s ultimate benefit, so long as you value innocent kids more than perverts.
Am I reading this right? You’re for prosecuting people who have broken no laws?
I’ll add this; I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?
This sounds like some Minority Report hellscape society.
Correct. This quickly approaches thought crime.
What about an AI gen of a violent rape and murder? Shouldn’t that also be illegal?
But we have movies that have depicted that sort of thing for years, graphically. Do those then become illegal after the fact?
And we also have movies of children being victimized, so do these likewise become illegal?
We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.
There is no reason to believe the same isn’t true for watching sexual assault. There are been many many movies that contain such scenes.
But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient but it’s possible and efficiency will increase.
The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication, and even then I’m sure workarounds would occur.
Prohibition requires society to sacrifice freedoms, and we have to decide what we’re willing to sacrifice here, because as we’ve seen with our prohibitions, once we unleash the law on one, it can be impossible to undo.
Ok, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, exactly the opposite of porn. It likely takes place off screen, and when rape does appear on screen, between zero and no nudity co-occurs. For children it basically always happens off screen.
Simulated child abuse has been federally illegal in the US for ~20 years, and we appear to have very little trouble telling the difference between prosecuting pedos and cinema, even whilst we have struggled enough with sexuality in general.
This argument works well enough for actual child porn. We certainly don’t catch it all but every prosecution takes one more pedo off the streets. The net effect is positive. We don’t catch most car thieves either and nobody suggests we legalize car theft.
No, I’m for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it’s the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it’s literally the present way things already are in many places.
Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?
Demand doesn’t really drop if something is illegal (same goes for drugs). The only thing you reduce is supply, which just results in making the illegal thing more valuable (this attracts the attention of shady money grabbers that hate regulation / don’t give a shit about law enforcement and therefore do illegal stuff to get money), and means you have to pay a shitton of government money maintaining all the prisons.
Basically every pedo in prison is one who isn’t abusing kids. Every pedo on a list is one who won’t be left alone with a young family member. Reducing AI CP doesn’t, by itself, actually do anything.
Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies that show, that dealing with Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.
People are locked up all the time for merely possessing child porn without having abused anyone. This isn’t a bad thing, because they are a danger to society.
No, they are not locked up because they’re a danger to society. They’re locked up because possessing CP is indirectly contributing to the abuse of the child involved.
Good arguments. I think I am convinced that both cases should be illegal.
If the pictures are real, they probably increase demand, which is harmful. If the person knew, then the action should result in jail and forced therapy.
If the pictures are not, forced therapy is probably the best option.
So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is “victimless”. If they don’t manage to plausibly seem rehabilitated by professionals, then jail time for them.
I would assume (but don’t really know) most pedophiles don’t truly want to act on it, and don’t want to have those urges. And would voluntarily go to therapy.
Which is why I am convinced prevention is the way to go. Not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help. There have been posters and ads for it a lot of places a year back or something. I have not researched how it works in practice though.
Edit: I don’t think the therapy we have in Norway is conversion therapy. It’s about minimizing risk and helping deal with the underlying causes, medication, childhood trauma etc. I am not necessarily convinced that conversion therapy works.
Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.
Let me start by saying I am not trying to state there is a 1:1 equivalence, this is just a comparison, but we have long abandoned conversion therapy for homosexuals, because we’ve found these preferences are core to them and not easily overwritten. The same is true for me as a straight person, I don’t think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.
The question is, if AI can produce pornography that can satisfy the urges of someone with pedophilia without harming any minors, is that a net positive? Remember the attraction is not the crime, it’s the actions that harm others that are. Therapy should always be on the table.
This is a tricky subject because we don’t want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It’s very hard to “stand up” for pedophilia because, if acted upon, it has monstrous effects, but AI is making us open this can of worms that I don’t believe we ever really thought through besides criminalizing and demonizing (which could be argued was the correct approach with the technology at the time).
I really don’t know enough about the subject or how that therapy works. I doubt that it is conversion therapy, but Ii really don’t know. I would assume it’s handling childhood trauma, medications etc.
Both therapy and whether satisfying urges through AI-generated content works are questions that should be answered scientifically. If there is research, then that should be the basis for what decisions are taken; if there is a lack of research, then more research should be the next step.
There’s nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
Nobody here has at all suggested it’s ok to rape kids. I hope you can understand the difference between thinking something and doing something.
This is true, but it really depends how “therapy” is defined, and forced therapy could mean anything, up to and including the old conversion therapy approach used on gay people.
You might argue that these days we wouldn’t do anything so barbaric, but considering that the nature of a pedophile’s attraction is seen as deeply unsavory, and the fact that it will never be acceptable, unlike homosexuality, people would be far more willing to abuse or exploit said “therapy”.
The simplest possible explanation here is that any porn created based on images of children is de facto illegal. If it’s trained on adults explicitly and you prompt it for child porn, that’s a grey area, probably going to follow precedent for drawn art rather than real content.
Already illegal here in the UK metro.co.uk/…/makers-ai-child-abuse-images-jailed…
Illegal in most of the West already, as creating sexual assault material of minors is already illegal regardless of method.
Pretend underage porn is illegal in the EU and some other countries. I believe, in the US it is protected by the first amendment.
Mind that when people talk about child porn or CSAM that means anything underage, as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There were some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.
CSAM, artificial or not, is illegal in the United States.
justice.gov/…/man-arrested-producing-distributing…
I see. I’ve looked up the details. Obscenity - whatever that means - is not protected by the first amendment. So where the material is obscene, it is still illegal.
en.wikipedia.org/…/Ashcroft_v._Free_Speech_Coalit…
I think the concern is that although it’s victimless, if it’s legal it could… Normalise (within certain circles) the practice. This might make the users more confident to do something that does create a victim.
Additionally, how do you tell if it’s real or generated? If AI does get better, how do you tell?
It was trained on something.
It can generate combinations of things that it is not trained on, so not necessarily a victim. But of course there might be something in there, I won’t deny that.
However, the act of generating something does not create a new victim unless it uses someone’s likeness and it is shared? Or is there something ethical here that I am missing?
(Yes, all current AI is basically collective piracy of everyones IP, but besides that)
Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.
So take that video and modify it a bit. Color correct or something. That’s still abuse, right?
So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?
That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
I can’t make that call. And because I can’t make that call, I can’t support the concept.
I mean, there’s another side to this.
Assume you have exacting control of training data. You give it consensual sexual play, including rough play, bdsm play, and cnc play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
on a base semantic and mechanic level, no, not at all. They aren’t real people, there aren’t any victims involved, and there aren’t any perpetrators. You might even be able to argue the opposite, that this is actually a net positive, because it prevents people from consuming real abuse.
Until you can publicly display your own or someone else’s process of thought, or read people’s minds, this is definitionally an impossible question to answer. So the default is no, because it’s not possible to base it in any frame of reality.
assuming it depicts no real persons or identities, no, there is nothing necessarily wrong about this, in fact i would defer back to the first answer for this one.
this is the same as the previous question, media format makes no difference, it’s telling the same story.
most people would argue, and i think precedent would probably agree, that this would start to be a problem when explicit external influences are a part of the motivation, rather than an explicitly internally motivated process. There is necessarily a morality line that must be crossed to become a more negative thing, than it is a positive thing. The question is how to define that line in regards to AI.
We already allow simulated rape in tv and movies. AI simply allows a more graphical portrayal.
Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.
Any art is ok as long as the artist consents. Even if they’re drawing horrible things, it’s just a drawing.
Now the real question is, should we include rapes of people who have died and have no family? Because then you can’t even argue increased suffering of the victim.
But maybe this just gets solved by curation and the “don’t be a dick” rule. Because the above sounds kinda dickish.
I see the issue with how much of a crime is enough for it to be okay, and the gray area. I can’t make that call either, but I kinda disagree with the black-and-white conclusion. I don’t need something to be perfectly ethical, few things are. I do however want to act in an ethical manner, and strive to be better.
Where do you draw the line? It sounds like you mean no AI can be used in any cases, unless all the material has been carefully vetted?
I highly doubt there isn’t illegal content in most AI models of any size by big tech.
I am not sure where I draw the line, but I do want to use AI services, but not for porn though.
It just means I don’t use AI to create porn. I figure that’s as good as it gets.
It’s not just AI that can create content like that though. 3d artists have been making victimless rape slop of your vidya waifu for well over a decade now.
Yeah, I’m ok with that.
AI doesn’t create, it modifies. You might argue that humans are the same, but I think that’d be a dismal view of human creativity. But then we’re getting weirdly philosophical.
is this a legal thing? I’m not familiar with the laws surrounding sexual abuse, on account of the fact that i don’t frequently sexually abuse people, but if this is an established legal precedent that’s definitely a good argument to use.
However, on a mechanical level, a recounting of an instance isn’t necessarily a 1:1 retelling of that instance. A video of rape, for example, isn’t abuse any more than the act of rape within it is, and of course the nonconsensual recording and sharing of it (because it’s rape) could be considered a crime of its own, same with possession. However, I’m not sure interacting with the video is necessarily abuse in its own right, based on semantics. The video most certainly contains abuse; the watcher of the video may or may not like that, and I’m not sure whether that should influence things, because that’s an external value. Something like “X person thought about raping Y person, and got off to it” would also be abuse under the same pretense at a certain point. There is certainly some interesting nuance here.
If i watch someone murder someone else, at what point do i become an accomplice to murder, rather than an additional victim in the chain. That’s the sort of litmus test this is going to require.
To be clear, this would be a statistically minimal amount of abuse; the vast majority of adult content is going to be legally produced and sanctioned, made public by the creators of those videos for the purposes of generating revenue. I guess the real question here is what percent of X is still considered “original” enough to count as the same thing.
Like, we’re talking probably less than 1% of all public porn, but a significant margin, is non-consensual (we will use this as the base), and the AI is trained on this set to produce a minimally alike, or entirely distinct, image from the feature set provided. So you could theoretically create a formula to determine how far removed you are from the original content in 1% of cases. I would imagine this is going to be a lot closer to 0 than to any significant number, unless you start including external factors, like intentionally deepfaking someone into it for example. That would be my primary concern.
another important concept here is human behavior as it’s conceptually similar in concept to the AI in question, there are clear strict laws regarding most of these things in real life, but we aren’t talking about real life. What if i had someone in my family, who got raped at some point in their life, and this has happened to several other members of my family, or friends of mine, and i decide to write a book, loosely based on the experiences of these individuals (this isn’t necessarily going to be based on those instances for example, however it will most certainly be influenced by them)
There’s a hugely complex hugely messy set of questions, and answers that need to be given about this. A lot of people are operating on a set of principles much too simple to be able to make any conclusive judgement about this sort of thing. Which is why this kind of discussion is ultimately important.
With this logic, any output of any pic-gen AI is abuse… I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all training data, as far as I understand the statistical behaviour of photo-gen AI.
We could be sure of it if AI curated its inputs, which really isn’t too much to ask.
Well AI is by design not able to curate its training data, but companies training the models would in theory be able to. But it is not feasible to sanitise this huge stack of data.
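(To make the scale problem concrete: the matching itself is cheap and well understood; one curation pass looks roughly like the sketch below, comparing perceptual hashes against a blocklist, which is the spirit of PhotoDNA-style filtering. The blocklist file and directory names are placeholders. The genuinely hard part is assembling a trustworthy blocklist and running this over billions of files.)

```python
from pathlib import Path

import imagehash
from PIL import Image

# Load a blocklist of known-bad perceptual hashes
# (hypothetical file, one hex hash per line).
with open("known_bad_phashes.txt") as f:
    blocklist = {imagehash.hex_to_hash(line.strip()) for line in f}

def is_blocked(path: Path, max_distance: int = 4) -> bool:
    # Hamming distance between pHashes; small distance = near-duplicate.
    h = imagehash.phash(Image.open(path))
    return any(h - bad <= max_distance for bad in blocklist)

kept = [p for p in Path("dataset").glob("*.jpg") if not is_blocked(p)]
```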
Yes?
There is no ethical consumption while living a capitalist way of life.
ML always there to say irrelevant things
😆as if this has something to do with that
But to your argument: It is perfectly possible to tune capitalism using laws to get veerry social.
I mean, every “actually existing communist country” is at its core still a capitalist system; or how would you argue against that?
The down-low of it is quite simple: if the content is public, available for anybody to consume, and copyright permits it (I don’t see why it shouldn’t in most cases, although if you make porn for money you probably hold exclusive rights to it, and you probably have a decent position to begin from, though a lengthy uphill battle nonetheless), there’s not really an argument against that. The biggest problem is identity theft and impersonation, more so than stealing work.
yeah bro wait until you discover where neural networks got that idea from
i have no problem with ai porn assuming it’s not based on any real identities, i think that should be considered identity theft or impersonation or something.
Outside of that, it’s more complicated, but i don’t think it’s a net negative, people will still thrive in the porn industry, it’s been around since it’s been possible, i don’t see why it wouldn’t continue.
Identity theft only makes sense for businesses. I can sketch naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
Revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and I would argue stealing someone’s identity/impersonation.
To be clear, your example is a sketch of Johnny Depp; I’m talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.
Again you’re talking about distribution
Sort of. There are arguments that private ownership of these videos is also weird and shitty; however, I think impersonation and identity theft are going to be the two most broadly applicable instances of relevant law here. Otherwise I can see issues cropping up.
Other people do not have any inherent rights to your likeness, you should not simply be able to pretend to be someone else. That’s considered identity theft/fraud when we do it with legally identifying papers, it’s a similar case here i think.
But the thing is, it’s not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it’s not inherently harmful, or at least it’s not obvious how it would be.
the only perceivable reason to create these videos is either for private consumption, in which case, who gives a fuck. Or for public distribution, otherwise you wouldn’t create them. And you’d have to be a bit of a weird breed to create AI porn of specific people for private consumption.
If AI isn’t involved, the same general principles would apply, except it might include more people now.
I’ve been thinking about this more, and I think one interesting argument here is “toxic culture growth”. As in, even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposure (like social media or forum discussions) even without the direct sharing.
I think this is slippery to the point of government mind control but maybe there’s something valuable to research here either way.
this is another big potential as well. Does it perpetuate cultural behaviors that you want to see in society at large? Similar things like this have resulted from misogyny and the relevant history of feminism.
It’s a whole thing.
i think there is probably a level of government regulation that is productive, i’m just curious about how we even define where that line starts and ends, because there is not really an explicitly clear point, unless you have solid external inferences to start from.
Honestly I’m quite happy with the “social justice warrior” approach. Sure, it’s flawed and weak to manipulation for now, but as a strategy for society to self-correct it’s quite brilliant.
I’m optimistic society itself should be able to correct itself for this issue as well though considering the current climate the correction might be very chaotic.
i mean, i’m not sure modern social justice is working as intended given the political landscape, but historically small communities do manage to self regulate very effectively, that one is for sure. I will give you that.
The only effective way to mandate something at a societal level is going to be laws, i.e. government, otherwise you’re going to have an extremely disjointed and culturally diverse society, which isn’t necessarily a bad thing.
I guess the point is this enables the mass production of revenge porn essentially at a person-on-the-street level, which makes it much harder to punish and prevent distribution. When it is relatively few sources that produce the unwanted product, then only punishing the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to try to regulate the production method. It is all a matter of where the most efficient position to put the bottleneck is.
For instance when 3D printing allows people to produce automatic rifles in their homes “saying civil use of automatic rifles is illegal so that is fine” is useless.
I think that’s a fair point and I wonder how will this effect the freedom of expression on the internet. If you can’t find the distributor then it’ll be really tough to get a handle of this.
On the other hand, the sheer overabundance could simply break the entire value of revenge porn, as in a “nothing is real anyway so it doesn’t matter” sort of thing, which I hope would be the case. No one will be watching revenge porn because they can generate any porn they want in a heartbeat. That’s the ideal scenario anyway.
It is indeed a complicated problem with many intertwined variables, wouldn’t wanna be in the shoes of policy makers (assuming that they actually are searching for an honest solution and not trying to turn this into profit lol).
For instance too much regulation on fields like this essentially would kill high quality open source AI tools and make most of them proprietary software leaving the field in the mercy of tech monopolies. This is probably what these monopolies want and they will surely try to push things this way to kill competition (talk about capitalism spurring competition and innovation!). They might even don the cloak of some of these bad actors to speed up the process. Given the possible application range of AI, this is probably even more dangerous than flooding the internet with revenge porn.
100% freedom with no regulations will essentially lead to a mixed situation of creative and possibly groundbreaking uses of the tech vs. many bad actors using the tech for things like scamming, disinformation etc. How it will balance out in the long run is probably very hard to predict.
I think two things are clear: 1) both extremities are not ideal, 2) between the two extremities, 100% freedom is still the better option (the former just exchanges many small bad actors for a couple of giant bad actors and chokes any possible good outcomes).
Based on these, starting with a solution closer to the “freedom edge” and improving it step by step based on results is probably the most sensible approach.
<img alt="" src="https://lemmy.world/pictrs/image/aa0708c2-87b0-40b5-82f4-9d429b84626f.gif">
With any model in use, currently, that is impossible to meet. All models are trained on real images.
yes but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person close enough to perceptibly be them?
You are literally using the schizo argument right now. “If an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, therefore he must be stealing their likeness.”
No, the problem is a lack of consent of the person being used.
And now, being used to generate depictions of rape and CSAM.
Yeah, but like, legally, is this even a valid argument? Sure, there is technically probably like 0.0001% of the average person being used in any given result of an AI-generated image. I don’t think that gives anyone explicit rights to that portion, however.
That’s like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.
You can argue about consent all you want, but at the end of the day if you’re posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably implicitly consenting to other people being able to use those images. (because you can’t stop people from doing that, except for copyright, but that’s not very strict in most cases)
I don’t see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim; otherwise it’s no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But I don’t know of any laws that prevent you from doing that, unless it’s explicitly to do with something like blackmail, extortion, or harassment.
The fundamental problem here is that you’re in an extremely uphill position to even begin the argument of “well it’s trained on people so therefore it uses the likeness of those people”
Does a facial structure recognition model use the likeness of other people? Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to begin to breakdown at what point that persons likeness begins, and at what point it ends. it’s simply an impossible task.
Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.
Morally, that’s what you’re doing when you use AI to generate CSAM. It’s the same reason we ban all previously created CSAM as well: you are victimizing the person every single time.
It makes them a victim.
The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.
Yes.
Exactly. So, without consent, it shouldn’t be used. Periodt.
If you have schizophrenia, sure. Legal is what the law defines as ok. Whether or not people get charged for it is another thing. The question is “do you have the legal right to do it or not”.
Legally, the reasoning behind this is that it’s just extremely illegal; there are almost no cases, if not zero, where it would be ok or reasonable, and therefore the moral framework tends to be developed around that. I don’t necessarily agree with it always being victimization, because there are select instances where it just doesn’t really make sense to consider it that; there are acts you commit that would make it victimization. However, I like to subscribe to the philosophy that it is “abusive” material, and therefore innately wrong. Like blackmail; I find that to be a little bit more strict and conducive to that sort of definition.
at one point in time yes, perpetually in some capacity, they will exist as having been a victim, or having been victimized at one point. I also don’t really consider it to be healthy or productive to engage in “once a victim always a victim” mentality, because i think it sets a questionable precedent for mental health care. Semantically someone who was a victim once, is still a victim of that specific event, however it’s a temporally relevant victimization, i just think people are getting a little loose on the usage of that word recently.
I’m still not sure how it makes that person a victim, unless it meets one of the described criteria i laid out, in which case it very explicitly becomes an abusive work. Otherwise it’s questionable how you would even attribute victimization to the victim in question, because there is no explicit victim to even consider. I guess you could consider everybody even remotely tangentially relevant to be a victim, but that then opens a massive blackhole of logical reasoning which can’t trivially be closed.
To propose a hypothetical here. Let’s say there is a person who we will call bob. Bob has created a depiction of “abuse” in such a horrendous manner that even laying your eyes upon such a work will forever ruin you. We will define the work in question to be a piece of art, depicting no person in particular, arguably barely resembling a person at all, however the specific definition remains to the reader. You could hypothetically in this instance argue that even viewing the work is capable of making people a “victim” to it. However you want to work that one out.
The problem here, is that bob hasn’t created this work in complete isolation, because he’s just a person, he interacts with people, has a family, has friends, acquaintances, he’s a normal person, aside from the horrors beyond human comprehension he has created. Therefore, in some capacity the influence of these people in his life, has to have influenced the work he engaged in on that piece. Are the people who know/knew bob, victims of this work as well, regardless of whether or not they have seen it, does the very act of being socially related to bob make them a victim of the work? For the purposes of the hypothetical we’ll assume they haven’t seen the work, and that he has only shown it to people he doesn’t personally know.
I would argue, and i think most people would agree with me, that there is no explicit tie in between the work that bob has created, and the people he knows personally. Therefore, it would be a stretch to argue that because those people were tangentially relevant to bob, are now victims, even though they have not been influenced by it. Could it influence them in some external way, possibly causing some sort of external reaction? Yeah, that’s a different story. We’re not worried about that.
This is essentially the problem we have with AI, there is no explicit resemblance to any given person (unless defined, which i have already explicitly opted out of) or it has inherently based the image off of via training (which i have also somewhat, explicitly opted out of as well) there are two fundamental problems here that need to be answered. First of all, how are these people being victimized? By posting images publicly on the internet? Seems like they consented to people at least being aware of them, if not to some degree manipulating images of them, because there is nothing to stop that from happening (as everyone already knows from NFTs) And second of all, how are we defining these victims? What’s the mechanism we use to determine the identity of these people, otherw
I refuse to debate ideas on how to make ethical CSAM with you.
Go find a pedo to figure out how you want to try to make CSAM, and you can well akshully all you want.
all of my arguments have explicitly removed any form of anything closely resembling CSAM to the point of being illegal under existing law, or at the very least, extremely questionable.
The only thing i haven’t excluded is the potential to use an AI trained explicitly on humans, with no children being in the dataset, being used to generate porn of someone “under the age of 18” which it has zero basis of reality on, and cannot functionally do. That would be the only actual argument i can think of where that wouldn’t already be illegal, or at least extremely questionable. Everything else i’ve provided a sufficient exclusion for.
Have fun calling me a pedo for no reason though, i guess.
thispersondoesnotexist.com
Refresh for a new fake person
this one’s a classic.
I’ve found that there’s a lot of things on the Internet that went wrong because it was ad supported for “free”. Porn is one of them.
There is ethically produced porn out there, but you’re going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it’s mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.
Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.
Whats illegal in real porn should be illegal in AI porn, since eventually we won’t know whether it’s AI
That’s the same as saying we shouldn’t be able to make videos with murder in them because there is no way to tell if they’re real or not.
That’s a good point, but there’s much less of a market for murder video industry
I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.
But I get the feeling you’re saying that there isn’t a huge market for showing real people dying realistically without their permission. But that’s more a technicality. The question is, is the content or the production of the content illegal? If it’s not a real person, who is the victim of the crime?
Yeah the latter. Also murder in films for the most part is for storytelling. It’s not murder simulations for serial killers to get off to, you know what I mean?
You are wrong.
AI media models have to be trained on real media. Illegal content would mean illegal training media, and benefiting from, supporting, and profiting from crimes against victims.
The lengths and fallacies pedophiles will go to to justify themselves are absurd.
Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thought and opinions at all. I am not in any way interested in this kind of content.
I encourage you to read my other posts in the different threads here and see. I am not an apologist, and do not condone it either.
I do genuinely believe AI can generate content it is not trained on, that’s why I claimed it can generate illegal content without a victim. Because it can combine stuff from things it is trained on and end up with something original.
I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.
You made me wish I didn’t…
Don’t pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.
You are right that it is a victimless crime (for the creation of the content). I could create minion porn without using real minion porn, to pick the most random example I could think of. There’s the whole defamation issue of publishing content without someone’s permission, but that feels like a discussion independent of AI (we could already create nasty images of someone before AI; AI just makes it easier). But such content for personal use… it is victimless, and I have a hard time arguing against it. Would the availability of AI-created content with unethical themes let people get that out of their system without creating victims? Would it make the far riskier and more horrible business of creating illegal content with real, unwilling people disappear, or at the very least become much less common? Or would it make people more willing to consume the content, creating a false sense of safety around content that was previously illegal? There are a lot of implications we should really be thinking about, and how they would affect society, for better or worse…
Thank you 😊
It’s time like this that I think, ahhh, the Internet never truly changed :D
This is the way.
Pics or it didn’t happen.
![image](https://lemmy.ca/pictrs/image/69bc4e9c-f80a-4d14-8546-d379d5ce835a.png)
😂
I feel like this should be a real tattoo….
The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far off year of 2025 appears.
Me: “You know, in the future, you’ll make your own porn videos.”
90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”
Me: “Ha! No. So Nvidia will release this system called CUDA…”
I thought this was going to go Watchmen for a moment. Like…
It is 1997, I am a young boy, I am jerking off to a grainy porno playing over stolen cinemax.
It is 2007, I am in my dorm, I am jerking off to a forum thread full of hi-res porno.
It is 2027, I am jerking off to an AI porno stream that mutates to my desires in real time. I am about to nut so hard that it shatters my perception of time.
Oops, the stream hallucinated and mutated into a horror show of women with nipples that are mouths and dicks with eyeballs.
Did I stutter?
Go on…
“Nothing else excites me anymore” - 2027 you, probably
I cannot for the life of me find it, but there was an article before the modern explosion of ML into the public consciousness where a guy found that one of the big search engines offered an API to “rate” NSFW images on a numeric scale. The idea was that site admins could use it to automatically filter content with a bit more granularity.
So naturally, he trained a naive image-generating model with the NSFW score as the sole metric of success, and got results much like you describe. Results were pretty much “what if Dalí and Cronenberg had a baby with Slaanesh?”
Where were you in 2017?
Went to college
I figured the rule of threes meant it was funnier to leave it out. 2017 would have been sad gooning to Pornhub during the first Trump nightmare.
Then 2027 could be sad gooning to AI hyperporn during the second Trump nightmare.
Maybe I should have used 20-year jumps, but “2037, I am jerking off because there’s no food, and the internet is nothing but AI porn” didn’t seem as funny a point for the “time shattering” bit.
Look it was a long thread okay.
The AI camera will still zoom in on the guy’s nuts as you’re about to bust tho.
Why ask it for gay porn if it doesn’t?
You nut so hard in 2027 it teleports you back to 1997?
Then another company called DeepSeek will release a system called low-level programming that replaces CUDA.
It’s really called Wanx?
Oh my God! That’s disgusting! AI porn online!? Where!? Where do they post those!? There’s so many of them, though. Which one?
Fucking disgusting unethical shit. Tell me where to find it so I can avoid it pls.
.
pop is funny
I’m a representative of the American pornographical societies of America and I would like to educate myself and others in this disgusting AI de-generated graphical content for the purpose of informing my friends and family about this. Could we please have a link?
You need an account, and to turn off the NSFW filter.
*Aliboobie
Good.
Hot take, but I think AI porn will be revolutionary, and mostly in a good way. The sex industry is an extremely wasteful and inefficient use of our collective time that also often involves a lot of abuse and dark business practices. It’s just somehow taboo to even mention this.
Sometimes you come across a video and you are like ‘oh this. I need more of THIS.’
And then you start tailoring searches to try to find more of the same, but you keep getting generic or repeated results because of the lack of well-described/defined content and the overuse of video tags (obviously to cast a wide net for more views rather than being specific).
But would i watch AI content if i could feed it a description of what i want? Hell yeah!
I mean there are only so many videos of girls giving a blowjob while he eats tacos and watches old Black & White documentaries about the advancements of mechanical production processes.
I hate it when you find a good video, it does the job really well, and a few days later you’re like, yeah, let’s watch that one again. You type in every single description you can about the video, find everything but the video you want, and the results are barely anywhere near as good.
Hell I’d take an AI porn search engine for now, let me describe in detail what the video I’m looking for is so I can finally see it again.
I usually just download them for this reason.
Always, always download your favourite videos. Today they’re available, tomorrow you don’t know. (Remember the great PornHub purge? Pepperidge Farm remembers.)
Not to mention the users that may have a specific interest in some topic/action where basically all potential sources are locked behind paywalls.
Can confirm with my fetish. Some great artists and live actors who do it, but 90% of the content for it online is bad MS Paint level edits and horrid acting with props. That 10%? God tier, the community showers them in praise and commissions and only stop when they want to, unless a payment service like Visa or Patreon censors them and their livelihood as consenting adults.
If only there was a p2p way to send people funds without anyone knowing the sender or the recipient…
Yes, but it will also hurt those very workers’ bottom line, and with some clever workarounds it’ll be used to fabricate defamatory material that other methods were not good at.
Good, the sooner this system fails to provide for more people, the sooner we form mobs and rid ourselves of it.
The human race needs to take its next evolutionary step: a post-scarcity approach to world economies.
AI won’t bring us a post scarcity world, but one with the upmost surveillance and “art” no longer made by artists.
We already live in a post-scarcity world: we produce more than enough goods to meet every person’s needs, we just throw away more than half of all food and clothing produced instead of giving it to the hungry, because it isn’t profitable.
*utmost
I can see it being used to make images of people who don’t know and haven’t consented, though, and that’s pretty shit.
And clearly will be used for CP.
Filters on online platforms are surprisingly effective, though. CivitAI is begging for it, and they’ve kept it out.
Same with Photoshop. I wish people didn’t do bad things with tools too, but we can’t let the worst of us define the world we live in.
Or cutting out faces with scissors and gluing them onto Playboy. Scissors and glue are just tools.
I am looking forward to this becoming common and effective. Being able to generate animated hentai in assorted styles would be neat. Lina Inverse getting boinked by Donald Duck could be a thing.
You can 100% already do this, with high quality, locally. Generate an image with PonyXL (plus LoRAs if it doesn’t get the characters quite right), then do img2vid with Hunyuan or this.
The catch is that it’s all unreasonably user-unfriendly to set up, and most online UIs are remarkably shit. Basically, AI land just needs a few more “integrators” to plug this stuff into each other and package it sanely (as opposed to researchers just throwing papers out, or “entrepreneurs” making some shitty closed cloud UI), plus some better hardware support, and you’re already there with no new models/tech.
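If anyone wants to poke at it anyway, here’s a rough sketch of that two-step pipeline using Hugging Face diffusers. Caveats: the PonyXL checkpoint id and LoRA filename below are placeholders you’d swap for your own, and Stable Video Diffusion stands in for the img2vid step, since the Hunyuan/Wan pipelines aren’t packaged in every diffusers release.

```python
# Rough sketch: txt2img with an SDXL-family checkpoint (e.g. a PonyXL
# derivative), then img2vid. Model ids below are placeholders/stand-ins.
import torch
from diffusers import StableDiffusionXLPipeline, StableVideoDiffusionPipeline
from diffusers.utils import export_to_video

# Step 1: generate a still image with an SDXL checkpoint + optional LoRA.
txt2img = StableDiffusionXLPipeline.from_pretrained(
    "your/ponyxl-checkpoint",               # placeholder model id
    torch_dtype=torch.float16,
).to("cuda")
txt2img.load_lora_weights("character_lora.safetensors")  # hypothetical LoRA file

image = txt2img("your prompt here", num_inference_steps=30).images[0]

# Step 2: animate the still with an img2vid model (SVD as a stand-in here).
img2vid = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt",
    torch_dtype=torch.float16, variant="fp16",
).to("cuda")

frames = img2vid(image.resize((1024, 576)), decode_chunk_size=8).frames[0]
export_to_video(frames, "clip.mp4", fps=7)
```

Swapping in a Hunyuan or Wan pipeline just replaces step 2; the overall shape of the workflow doesn’t change.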
![image](https://sh.itjust.works/pictrs/image/cd995297-b402-4773-a3c4-607486143303.png)
It’s super funny to me that they named it “WanX” haha
it fulfilled its name mandate 100%
Hmm, it’s certainly blowing something.
Kudos to the English speaking guy on the team who pushed that under the radar. That has got to be intentional.
Is that JINX?
Ye
I don’t like that I was able to tell
fucking saw it too 😅
Yes, here’s the link to the uncensored clip, NSFW obviously, but you’ll need to put in an email to turn off NSFW filtering. civitai.com/images/60144368
The censored images in the thumbnail were all pulled from videos uploaded to this LoRA on Civitai, called "Better Titfuck (WAN and HunYuan)".
I didn’t know who that was before looking it up, and I totally thought you were talking about the Pokémon lol
My friend wants to know if it can generate VR videos too?
Asking the real questions
Tell your friend that my friend is also asking too.
Theoretically? Maybe.
There are already networks to generate depth maps out of 2D video, so hooking up the output to that and a simpler VR video encoder should probably work.
Will it be jank as hell? Oh yeah. Nothing is conveniently packaged in Local AI Land, unfortunately.
Take anything created, ffmpeg it into frames, then run it through this script’s stereoscopic mode:
github.com/…/stable-diffusion-webui-depthmap-scri…
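The “ffmpeg it into frames” part is the easy bit. Something like the sketch below, assuming ffmpeg is on your PATH; the filenames and fps are made up, and the stereoscopic processing itself happens in the linked script:

```python
# Sketch: split a clip into frames for the depthmap script, then reassemble
# the processed frames into a video. Paths/fps are placeholder values.
import subprocess
from pathlib import Path

def video_to_frames(video: str, out_dir: str, fps: int = 24) -> None:
    """Dump a video to numbered PNG frames with ffmpeg."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-i", video, "-vf", f"fps={fps}",
         f"{out_dir}/frame_%06d.png"],
        check=True,
    )

def frames_to_video(frame_dir: str, out: str, fps: int = 24) -> None:
    """Reassemble numbered PNG frames into an H.264 video."""
    subprocess.run(
        ["ffmpeg", "-framerate", str(fps), "-i", f"{frame_dir}/frame_%06d.png",
         "-c:v", "libx264", "-pix_fmt", "yuv420p", out],
        check=True,
    )

video_to_frames("clip.mp4", "frames")
# ...run the depthmap script's stereoscopic mode over ./frames here,
# writing side-by-side frames to ./frames_sbs...
frames_to_video("frames_sbs", "clip_vr_sbs.mp4")
```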
this is gonna take rule 34 to a whole other level
Even Onlyfans girls and boys will lose their jobs because of AI :(
This is going to destroy what’s left of the social fabric
You mean deep fakes? Yeah it’s really a not great development
AI porn is dull. I’ll take poorly drawn heinous fetish art over your AI-generated slop.
Yeah, but I’m sure it’ll improve in time. I can’t really see the point of AI porn though, unless it’s to make images of someone who doesn’t consent, which is shit. For everything else, regular porn has it covered and looks better too.
The porn industry destroys women’s lives, so if AI becomes indistinguishable, women won’t need to sell their bodies in that way anymore.
citation needed
You want me to find a citation explaining why women selling their bodies is bad?
Somehow you’re both partially wrong. There are people who have been badly abused by the porn industry. There’s always a need to make sure people are safe. But there’s nothing wrong if someone wants to willingly perform in pornography.
There aren’t very many porn stars who don’t regret it; you can find countless examples of regret.
But it’s mostly because of you people. You make their lives miserable with pointless moralisation. You are the reason the industry is full of shady monsters; you made it that way with your constant religious fervor.
I’m not religious?
I also don’t even consume porn because I’m not some loser degen
Your comment is so aggressive, go fuck yourself
I’m sure they plan to!
And there it is. Good unmasking of your true self there.
The answer to this question is no, but it’s the least of your problems here.
Your comment doesn’t make sense. Are you attacking your nightmare monster?
Do you need me to repeat it using simpler language?
Yes, explain how you turn one commenter into “you people” based on facts from your ass?
I would argue that in terms of being worn out (joints and stuff), construction is definitely harder on the body. How many regret that they can barely walk in their 50s and 60s?
I mean, they still have their body after shooting porn, so they’re not really selling their body any more than, say, a construction worker.
All work is selling your body for money; sex work is just using it for sexual pleasure instead of selling burgers or making equipment.
Except those jobs don’t ruin your dignity, reputation, etc.
Neither does porn. People who think sex workers are immoral are the real degenerates.
They do, people mock them all the time.
“Unskilled jobs” “I don’t need to listen to a burger flipper rentoid”
Why are you assuming that it is? Maybe it’s because I’m not a religious person, but I don’t see anything morally wrong with sex work. Whether someone is doing it against their will is a separate issue, but that’s not an assumption I’d make without other evidence.
If you really are coming at this issue from a religious point of view, then there’s no point getting into a discussion here since I’m not going to change your mind on that (nor do I care to; believe what you want). Otherwise, I’m curious what your actual arguments might be.
Patriarchy destroys women’s lives. Porn is just having sex in front of a camera.
Anime/semi-real style is already shockingly good.
It’s pleasing to the eye but lacks the soul and passion of a real, human gooner who just wants to make people cum
What’s the URL for this generator so I can avoid it?
Definitely don’t click on this link otherwise you might try to install an AI locally
Depraved! Disgusting! I’d never!
Unless??? (👁 ͜ʖ👁)
good luck trying to run a video model locally
Unless you have top tier hardware
1.4B should be surprisingly doable, especially once quantization/optimized kernels are figured out.
HunyuanVideo can already run on a 12GB desktop 3060.
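For anyone curious how that fits in 12GB: it’s mostly CPU offloading plus tiled VAE decode. A rough sketch of the usual knobs in diffusers; the model id is assumed to be the community diffusers port, and the resolution/frame counts are illustrative, not tuned:

```python
# Sketch of the standard VRAM-saving knobs for a big video model in diffusers.
# Model id assumed; resolution/frame counts are illustrative.
import torch
from diffusers import HunyuanVideoPipeline
from diffusers.utils import export_to_video

pipe = HunyuanVideoPipeline.from_pretrained(
    "hunyuanvideo-community/HunyuanVideo",  # assumed community repo id
    torch_dtype=torch.float16,
)
pipe.enable_model_cpu_offload()  # keep only the active submodule on the GPU
pipe.vae.enable_tiling()         # decode the video in tiles to cap VRAM spikes

video = pipe(
    "a cat walks on grass",   # prompt
    height=320, width=512,    # modest resolution to stay inside 12GB
    num_frames=61, num_inference_steps=30,
).frames[0]
export_to_video(video, "out.mp4", fps=15)
```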
Like 3080 top tier or 5090 ti top tier?
How would one go about installing this? Asking because I don’t want to accidentally install it on my system
I WISH I KNEW
then I could stop people from doing it
So, I definitely didn’t click! But having clicked on GitHub links in the past, I can surmise that there’s a step-by-step install guide and also one for model acquisition. Just be sure not to click the link, and definitely do not follow what I assume is a very well written and easily understood step-by-step install guide.
This is Lemmy. Why not self host the generation? :)
At least don’t censor the photos, so for educational purposes only we can collectively judge whether this AI art is indeed worth a damn.
Finally, my dream of making black gay porn is realized
If you build it, they will porn.
Why you think the net was born? 🎶
For porn 🎶
That’s disgusting! Where?
I bet it’s all trained on Amouranth stuff, based on the pixelated images.
What I find incredible is that they named this thing WanX, which can alternatively be pronounced Wanks. Nominative Determinism at its finest
Disgusting AI 🤒
no examples?