taladar@sh.itjust.works
on 02 Apr 2025 21:10
nextcollapse
Does it feel odd to anyone else that a platform for something this universally condemned in any jurisdiction can operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system, and a nearly six-figure number of videos? I mean, even if we assume that some of those 4 years were intentional to allow law enforcement to catch as many perpetrators as possible, this feels too similar to fully legal operations in scope.
deegeese@sopuli.xyz
on 02 Apr 2025 21:28
nextcollapse
Illegal businesses can operate online for a long time if they have good OpSec. Anonymous payment systems are much easier these days because of cryptocurrencies.
BlueEther@no.lastname.nz
on 03 Apr 2025 04:28
collapse
Is that why Trump is so for them?
gravitas_deficiency@sh.itjust.works
on 03 Apr 2025 07:17
collapse
Yeah, more or less
x00z@lemmy.world
on 02 Apr 2025 23:09
nextcollapse
It’s a side effect of privacy and security - the one side effect they’re trying to use to undermine all of the privacy and security.
TheProtagonist@lemmy.world
on 03 Apr 2025 13:38
collapse
This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their “businesses” from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.
In the real world, criminals (e.g. drug dealers) also use cars, so you could argue that drug trafficking is a side effect of people having cars…
Cethin@lemmy.zip
on 03 Apr 2025 14:02
nextcollapse
Well, it does have to do with privacy and security, it just doesn’t matter if it’s legal or not for them. These people (in the US) always make a point that criminals will buy guns whether it’s legal or not, but then they’ll argue they need to destroy privacy because criminals are using it. It doesn’t make sense, but it doesn’t need to because honesty or consistency aren’t important.
This platform used Tor. And because we want to protect privacy, they can make use of it.
This particular platform used Tor. It doesn’t mean all platforms are using privacy-centric anonymous networks. There are incidents of people using Kik, Snapchat, Facebook and other clearnet services to commit criminal actions such as dealing drugs or sharing CP.
Andromxda@lemmy.dbzer0.com
on 05 Apr 2025 15:41
collapse
For anyone interested in the Kik controversy: There’s a great episode of the Darknet Diaries podcast about it: darknetdiaries.com/episode/93/
surewhynotlem@lemmy.world
on 03 Apr 2025 04:30
nextcollapse
universally condemned
There are a few countries that would disagree
taladar@sh.itjust.works
on 03 Apr 2025 13:03
collapse
Which countries do you have in mind where videos of sexual child abuse are legal?
surewhynotlem@lemmy.world
on 03 Apr 2025 16:15
nextcollapse
Context is important I guess. So two things.
Is something illegal if it’s not prosecuted?
Is it CSA if the kid is 9 but that’s marrying age in that country?
If you answer yes, then no, then we’ll not agree on this topic.
taladar@sh.itjust.works
on 03 Apr 2025 16:37
collapse
I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more wide-spread bans on videos including sex of any kind.
As for prosecution, yes, it is still illegal if it is not prosecuted. There are many reasons not to prosecute something, ranging all the way from resource and other practical concerns to intentionally turning a blind eye, and only a small minority of them would lead a country to actively sabotage a major international investigation, especially after the trade-offs are considered (such as the loss of international reputation that comes with refusing to cooperate).
Pick any country where child marriage is legal and where women are an object the man owns.
A marketing person? They took “Netflix” and changed the first three letters lol
So you are saying it is too creative for the average person in marketing?
Exactly! There is a plethora of *flix sites out there, including adult ones. It doesn’t take much marketing skill to name a site like this.
With the amount of sites that are easily accessed on the dark net through the Hidden Wiki and other directories, this might have been a honeypot from the start.
On the contrary, why would they announce that they seized the site? To cause more panic, and to exaggerate the actual situation?
In addition, that last point should be considered because even if they used these types of operations, honeypotting would still be considered illegal. So ultimately, what is stopping the supreme power from abusing that power against other people?
No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material, meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, which undoubtedly protected far more children than it harmed, and even that was considered too far for many.
“That would be an unspeakable atrocity”, yet there is a contradiction in the final sentence. The issue is, what evidence is there to prove such an operation actually works? As my last point implied, what stops the government from abusing this sort of operation? With “covert” operations like this, the outcome can be catastrophic for everyone.
swelter_spark@reddthat.com
on 03 Apr 2025 22:08
nextcollapse
It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they’re doing. I’ve often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.
Cryophilia@lemmy.world
on 03 Apr 2025 23:03
nextcollapse
If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.
swelter_spark@reddthat.com
on 04 Apr 2025 01:16
nextcollapse
I don’t know about that.
I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.
When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.
Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.
I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.
Cryophilia@lemmy.world
on 04 Apr 2025 04:53
collapse
typical file-sharing networks
Tox messaging network
Matrix channels
I would consider all of these to be trawling dark waters.
Schadrach@lemmy.sdf.org
on 04 Apr 2025 13:48
nextcollapse
…and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.
Cryophilia@lemmy.world
on 04 Apr 2025 17:38
collapse
Correct. And even then, there’s vanishingly little CP on lemmy.
swelter_spark@reddthat.com
on 04 Apr 2025 16:09
collapse
File-sharing and online chat seem like basic internet activities to me.
Cryophilia@lemmy.world
on 04 Apr 2025 17:37
collapse
This ain’t the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.
Schadrach@lemmy.sdf.org
on 09 Apr 2025 14:41
collapse
97% of the internet has no idea what Matrix channels even are.
I’ve been able to explain it to people pretty easily as “like Discord, but without Discord administration getting to control what’s allowed, only whoever happens to run that particular server.”
Not stumbled upon it, but I’ve met a couple people offering it on mostly normal Discord servers.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 03:02
nextcollapse
Search “AI woman porn miniskirt,” and tell me you don’t see questionable results in the first 2 pages: women who at least appear possibly younger than 18. Because AI is so heavily corrupted with this content en masse, it has leaked over into Google searches, with most porn categories being corrupted by AI seeds that can be anything.
Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.
There are very weird, very large movements regarding illegal content (whether you intentionally search it or not) and blackmail and that’s all I will point out for now
Cryophilia@lemmy.world
on 04 Apr 2025 05:00
collapse
Search “AI woman porn miniskirt,”
Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed vs a 100% fake image. I didn’t find any AI CP, but even if I did, it’s in an entirely different universe of morally bad.
r/jailbait
That was, what, fifteen years ago? It’s why I said “in the last decade”.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 15:30
collapse
“Clearly in their late teens,” lol no. And since AI doesn’t have age, it’s possible that was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.
Obviously there’s a difference with AI porn vs real - that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses images to seed its new images, including CSAM.
Cryophilia@lemmy.world
on 04 Apr 2025 17:36
nextcollapse
It’s fucking AI, the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 19:02
collapse
We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.
It’s meant to LOOK LIKE a 14 yr old because it is SEEDED OFF 14 YR OLDS so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE per the original commenter claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE? Now imagine who the CHILD MODELS are seeding from.
You’re one of the people I am talking about when I say Lemmy has a lot of creepy pedos on it FYI to all the readers, look at their history
Cryophilia@lemmy.world
on 04 Apr 2025 19:14
collapse
I’m offended by stupid-ass shit and feel compelled to call it out. Thinking AI-generated fake images is just as harmful as actual children getting abused is perhaps the most obvious definition of “stupid-ass shit” since the invention of stupid-ass shit.
Schadrach@lemmy.sdf.org
on 09 Apr 2025 15:35
collapse
was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.
That’s…not how AI image generation works? AI image generation isn’t just building a collage from random images in a database - the model doesn’t have a database of images within it at all - it just has a bunch of statistical weightings and net configuration that are essentially a statistical model for classifying images, being told to produce whatever inputs maximize an output resembling the prompt, starting from a seed. It’s not “seeded with an image of a 15 year old”, it’s seeded with white noise and basically asked to show how that white noise looks like (in this case) “woman porn miniskirt”, then repeat a few times until the resulting image is stable.
Unless you’re arguing that somewhere in the millions of images tagged “woman” being analyzed to build that statistical model is probably at least one person under 18, and that any image of “woman” generated by such a model is necessarily underage because the weightings were impacted however slightly by that image or images, in which case you could also argue that all drawn images of humans are underage because whoever drew it has probably seen a child at some point and therefore everything they draw is tainted by having been exposed to children ever.
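(For readers unfamiliar with the process being described, here is a toy sketch of a diffusion sampling loop in Python. The denoiser here is a stand-in invented for illustration - a real model is a trained neural network - but the structure is the same: start from pure white noise and repeatedly refine toward the prompt, with no image database anywhere in the loop.)

```python
import numpy as np

def denoiser(image, prompt_embedding, t):
    # Stand-in for the trained model: a real diffusion model is a neural net
    # that predicts the noise present in the image at step t, conditioned on
    # the prompt. Here we just nudge pixels toward the prompt embedding.
    return image - 0.1 * (image - prompt_embedding)

def generate(prompt_embedding, steps=50, shape=(64, 64, 3), seed=0):
    rng = np.random.default_rng(seed)
    image = rng.normal(size=shape)       # the "seed" is literally white noise
    for t in reversed(range(steps)):     # iteratively refine toward the prompt
        image = denoiser(image, prompt_embedding, t)
    return image

# prompt_embedding stands in for the text encoder's output for a prompt;
# notice there is no database of images anywhere in this loop.
result = generate(prompt_embedding=np.zeros((64, 64, 3)))
print(result.shape, float(result.std()))
```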
LustyArgonianMana@lemmy.world
on 11 Apr 2025 16:42
collapse
Yes, it is seeded with kids’ faces, including specific children like someone’s classmate. And yes, those children, all of them, who it uses as a reference to make a generic kid face, are all being abused because it’s literally using their likeness to make CSAM. That’s quite obvious.
It would be different if the AI was seeding models from cartoons, but it’s from real images.
Schadrach@lemmy.sdf.org
on 12 Apr 2025 00:55
collapse
OK, so this is just the general anti-AI image generation argument where you believe any image generated is in some meaningful way a copy of every image analyzed to produce the statistical model that eventually generated it?
I’m surprised you’re going the CSAM route with this and not just arguing that any AI generated sexually explicit image of a woman is nonconsensual porn of literally every woman who has ever posted a photo on social media.
LustyArgonianMana@lemmy.world
on 12 Apr 2025 01:05
collapse
No, I am not saying that.
I’m saying if an AI image is sexual and seeded from irl child models, then it is CSAM.
Adult women who consensually post their image to social media are WAY different than children who can’t consent to even enter a contract.
Also, I did argue that already when I mentioned how many AI porn models are directly seeded from Scarlett Johansson’s face. Can you read?
Schadrach@lemmy.sdf.org
on 14 Apr 2025 21:03
collapse
To be clear, when you say “seeded from” you mean an image that was analyzed as part of building the image classifying statistical model that is then essentially running reverse to produce images, yes?
And you are arguing that every image analyzed to calculate the weights on that model is in a meaningful way contained in every image it generated?
I’m trying to nail down exactly what you mean when you say “seeded by.”
It can hide in plain sight, and then when you dig into someone’s profile, it can lead to someone or a group discussing CSAM and bestiality, not just CP - like a site similar to r/pics, or a porn site. Yeah, sometimes you stumble into a site like that, but it seems to occur when people search for porn outside of Pornhub and affiliated sites. Remember, PH sanitized their site because of this. Last decade there was an article about an obscure site that was taken down; it had Reddit-like porn subs, etc. Then people were complaining about the CSAM, and nothing was done about it. It was eventually taken down for legal reasons not related to the CSAM.
swelter_spark@reddthat.com
on 04 Apr 2025 16:47
collapse
I can definitely see how people could find it while looking for porn. I don’t understand how people can do this stuff out in the open with no consequences.
Yeah, it’s often hidden too well to be easily found out, and authorities might want to gather evidence, so they let it accumulate and then pounce. One of the sites was mostly innuendo and talk about committing it, but not actually distributing the material; they co-opt certain images to pervert them. Other deviancies like bestiality were also present.
lumony@lemmings.world
on 04 Apr 2025 01:27
collapse
It would feel odd, but you have to remember we live in a world where Epstein was allowed to get away with what he did until the little people found out.
Trump was his most frequent guest, and Trump had his goons do everything in their power to get rid of the evidence while Epstein was still alive. A lot of politicians of different countries are part of it, as are Hollywood execs; Weinstein was probably the most infamous one.
El_Azulito@lemmy.world
on 03 Apr 2025 04:44
nextcollapse
Holy. Shit. This is somehow glorious on an oled iPad screen. The late 1900s never looked so good.
gravitas_deficiency@sh.itjust.works
on 03 Apr 2025 07:20
collapse
Thank you for giving me a nostalgia jolt that I didn’t know I wanted, but am fully enjoying right now
candyman337@sh.itjust.works
on 02 Apr 2025 21:45
nextcollapse
When I read the very first bit I was like “oh is that a new kids tv streaming platform from the same creators as kidpix?” Then I was immediately hit with the reality of the grim world we live in
lka1988@lemmy.dbzer0.com
on 03 Apr 2025 00:24
nextcollapse
Same here! We had it on a Macintosh LC 575. I will never forget those sound effects haha.
Jakeroxs@sh.itjust.works
on 03 Apr 2025 01:37
nextcollapse
Haha yep, kidpix deluxe 3 I remember fondly
Deceptichum@quokk.au
on 03 Apr 2025 05:37
nextcollapse
I loved the explosion sound, and the “oh no” when you click the undo button. I have the Windows versions of KidPix on CD somewhere.
gravitas_deficiency@sh.itjust.works
on 03 Apr 2025 07:18
nextcollapse
Omg I remember Kidpix! It was great!
This… not so much.
ayyy@sh.itjust.works
on 03 Apr 2025 17:25
collapse
Oh no!
clearedtoland@lemmy.world
on 02 Apr 2025 21:36
nextcollapse
With everything going on right now, the fact that I still feel physically sick reading things like this tells me I haven’t gone completely numb yet. Just absolutely repulsive.
unphazed@lemmy.world
on 02 Apr 2025 21:59
collapse
I know I’m not heartless yet because I am still traumatized by the brick in the window video…
Scrollone@feddit.it
on 02 Apr 2025 23:18
nextcollapse
Fuck. Don’t make me think about that video. Fuck. Shit.
BumpingFuglies@lemmy.zip
on 02 Apr 2025 23:35
nextcollapse
I’ll probably regret asking, but I’m out of the loop and insatiably curious.
Brick in the window video?
SARGE@startrek.website
on 02 Apr 2025 23:50
collapse
If it’s what I’m thinking of, camera footage of a vehicle interior.
Driving down the highway, going under an overpass when a brick gets tossed by some kids and goes through the window.
Passenger hit, husband is driving and screams.
You know that scream they mention in The Princess Bride? That “only someone experiencing ultimate suffering” can make?
If you know, you know.
aviationeast@lemmy.world
on 03 Apr 2025 00:19
nextcollapse
I’ve never seen that video but I can hear that scream from the husband. That’s some fucked up shit.
BumpingFuglies@lemmy.zip
on 03 Apr 2025 00:26
nextcollapse
Oh no. I remember that video now. I didn’t need to remember that video. Why did I have to ask?!
It’s not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.
Edit: oh hurray, there’s two different brick videos.
SARGE@startrek.website
on 03 Apr 2025 15:47
collapse
Well, I know what other video I’m never watching.
And people wonder why I don’t like being around any vehicle that carries things…
dharmacurious@slrpnk.net
on 02 Apr 2025 23:41
collapse
Gonna ruin me, but seconding. Brick in the window video?
FauxLiving@lemmy.world
on 03 Apr 2025 00:00
nextcollapse
To sanitize the traumatic video as much as possible: A man is driving under an overpass and a brick is dropped through the passenger side window instantly killing his wife. He reacts in horror.
unphazed@lemmy.world
on 03 Apr 2025 02:26
nextcollapse
Not just horror… the torment in that voice… fucking hell. Feel so bad for them both.
MandragoreaeAnimada@chachara.club
on 03 Apr 2025 03:50
collapse
Link to the brick video please?
deranger@sh.itjust.works
on 03 Apr 2025 04:00
collapse
If you really want to see this type of content you can easily find it with the tiniest amount of motivation.
MandragoreaeAnimada@chachara.club
on 03 Apr 2025 04:03
collapse
With my luck I would try to find the video, end up getting my PC or phone infected with spyware or ransomware, and have a bad time. Yeah I rely on other humans to vet things.
Maybe the ransomware would be better than seeing the video
HappyFrog@lemmy.blahaj.zone
on 03 Apr 2025 15:15
collapse
They’re probably referencing the video where a woman was killed after a brick flew through the windshield. I haven’t watched it, but it is on YouTube and I’ve heard that the husband’s cries are not so nice.
I don’t remember if it was kids throwing bricks off of a bridge or if it was something else.
Lightsong@lemmy.world
on 02 Apr 2025 23:52
nextcollapse
1.8m users, how the hell did they run that website for 3 years?
danny161@discuss.tchncs.de
on 03 Apr 2025 07:36
nextcollapse
That’s unfortunately (not really sure) probably the fault of Germany’s approach to that.
It usually does not take these websites down but tries to find the guys behind them and arrest them. The argument is: they will just use a backup and start a “KidFlix 2” or something like that.
Some investigations show that this is not the case and that deleting is very effective. Also, the German approach completely ignores the victim side. They have to deal with old men masturbating to them getting raped online. Very disturbing…
taladar@sh.itjust.works
on 03 Apr 2025 13:08
nextcollapse
Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material and in exchange it leads to fewer future victims it might be worth the trade-off but it is certainly not an easy choice to make.
yetAnotherUser@discuss.tchncs.de
on 03 Apr 2025 19:25
nextcollapse
It doesn’t though.
The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.
If everything posted to these forums is deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years to catch a handful of site admins.
Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.
Also, these sites don’t produce CSAM themselves. They just spread it - most of the CSAM exists already and isn’t made specifically for distribution.
taladar@sh.itjust.works
on 03 Apr 2025 22:59
collapse
Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content. The ones doing the part that is orders of magnitude worse than anything else about this.
yetAnotherUser@discuss.tchncs.de
on 04 Apr 2025 02:37
collapse
I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads, because it’s orders of magnitude easier.
Of the 1400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t put my hopes up too much that anyone who was caught ever committed child abuse themselves.
I mean, 1400 identified out of 1.8 million really isn’t a whole lot to begin with.
taladar@sh.itjust.works
on 04 Apr 2025 10:17
collapse
If most are reuploads anyway that kills the whole argument that deleting things works though.
yetAnotherUser@discuss.tchncs.de
on 04 Apr 2025 10:55
collapse
Not quite. Reuploading is at the very least an annoying process.
Uploading anything over Tor is a gruelling process. Downloading takes much time already, uploading even more so. Most consumer internet plans aren’t symmetric either, with significantly lower upload than download speeds. Plus, you need to find a direct-download provider which doesn’t block Tor exit nodes and where uploading/downloading is free.
Taking something down is quick. A script scraping these forums which automatically reports the download links (any direct-download site quickly removes reported CSAM, by the way - no one wants to host this legal nightmare) can take down thousands of uploads per day.
Making the experience horrible leads to a slow death of those sites. Imagine if 95% of videos on [generic legal porn site] lead to a “Sorry! This content has been taken down.” message. How much traffic would the site lose? I’d argue quite a lot.
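(A minimal sketch of what such a reporting script could look like, purely to illustrate the workflow. The forum URL, host name and abuse endpoint are hypothetical placeholders; a real effort would report through each host’s actual abuse form and to hotlines like NCMEC/INHOPE.)

```python
import re
import requests

# Hypothetical mapping of file hosts to their abuse-report endpoints.
# Real hosts each have their own form, and reports should also go to
# hotlines/law enforcement rather than only to the host.
ABUSE_ENDPOINTS = {
    "files.example.com": "https://files.example.com/abuse",
}

LINK_RE = re.compile(r"https?://files\.example\.com/\S+")

def report_page(forum_page_url: str) -> int:
    """Scrape one forum page and report every direct-download link found."""
    html = requests.get(forum_page_url, timeout=30).text
    links = set(LINK_RE.findall(html))
    for link in links:
        host = link.split("/")[2]
        requests.post(ABUSE_ENDPOINTS[host],
                      data={"url": link, "reason": "csam"}, timeout=30)
    return len(links)

# report_page("http://forum-example.onion/latest")  # would need a Tor proxy
```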
Ledericas@lemm.ee
on 04 Apr 2025 03:51
nextcollapse
That is still CP, and distributing CP still harms children; eventually they want to move on to the real thing, as porn is not satisfying them anymore.
Schadrach@lemmy.sdf.org
on 04 Apr 2025 13:35
collapse
eventually they want to move on to the real thing, as porn is not satisfying them anymore.
Isn’t this basically the same argument as arguing violent media creates killers?
ZILtoid1991@lemmy.world
on 04 Apr 2025 05:41
nextcollapse
Issue is, AI is often trained on real children, sometimes even real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.
Also since AI can generate photorealistic imagery, it also muddies the water for the real thing.
misteloct@lemmy.world
on 04 Apr 2025 06:08
nextcollapse
Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.
Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but it’s pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.
Schadrach@lemmy.sdf.org
on 09 Apr 2025 14:29
collapse
A more apt comparison would be people who go out of their way to hurt animals.
Is it? That person is going out of their way to do actual violence. It feels like arguing someone watching a slasher movie is more likely to make them go commit murder is a much closer analogy to someone watching a cartoon of a child engaged in sexual activity or w/e being more likely to make them molest a real kid.
We could make it a video game about molesting kids and Postal or Hatred as our points of comparison if it would help. I’m sure someone somewhere has made such a game, and I’m absolutely sure you’d consider COD for “fun and escapism” and someone playing that sort of game is doing so “in bad faith” despite both playing a simulation of something that is definitely illegal and the core of the argument being that one causes the person to want to the illegal thing more and the other does not.
danny161@discuss.tchncs.de
on 05 Apr 2025 06:29
collapse
That would be acceptable, but that’s unfortunately not what’s happening. Like I said: since the majority of the files are hosted on file-sharing platforms on the normal web, it’s way more effective to let them be deleted by those platforms.
Some investigative journalists tried this and documented the frustration among the users of child porn websites at having to reload all the GBs/TBs of material - they rather quit and shut down their sites than go the extra mile.
TheProtagonist@lemmy.world
on 03 Apr 2025 13:33
nextcollapse
I think you are mixing up two different aspects of this and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites into jail.
In this specific case they were observing this platform (together with prosecutors from other countries in an orchestrated operation) to gather as much data as possible about the structure, the payment flows, the admins and the users before moving into action and getting them arrested. The site was taken down in the meantime.
If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.
recall519@lemm.ee
on 03 Apr 2025 13:35
nextcollapse
This feels like one of those things where couch critics aren’t qualified. There’s a pretty strong history of three letter agencies using this strategy successfully in other organized crime industries.
drmoose@lemmy.world
on 04 Apr 2025 01:26
nextcollapse
I used to work in netsec and unfortunately government still sucks at hiring security experts everywhere.
That being said, hiring here is extremely hard - you need to find someone with below-market salary expectations willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this though. Just pay people more, and I’m sure every European citizen wouldn’t mind a 0.1% tax increase for a more effective investigation force.
Ledericas@lemm.ee
on 04 Apr 2025 03:49
nextcollapse
They probably make double or triple in the private sector; I doubt the government can match that salary. Even FB probably paid more, before they started using AI to sniff out CP.
I’m a senior dev and tbh I’d take a lower salary given the right cause, though having to work with this sort of material is probably the main bottleneck here. I can’t imagine how people working on this can even fall asleep.
lennivelkant@discuss.tchncs.de
on 04 Apr 2025 15:24
collapse
Most cases of “we can’t find anyone good for this job” can be solved with better pay. Make your opening more attractive, then you’ll get more applicants and can afford to be picky.
Getting the money is a different question, unless you’re willing to touch the sacred corporate profits…
Maeve@midwest.social
on 04 Apr 2025 01:43
nextcollapse
And yet there are cases like Kim Dotcom, Snowden, Manning, Assange…
Schadrach@lemmy.sdf.org
on 04 Apr 2025 13:29
collapse
They have to deal with old men masturbating to them getting raped online.
The moment it was posted to wherever they were going to have to deal with that forever. It’s not like they can ever know for certain that every copy of it ever made has been deleted.
_cryptagion@lemmy.dbzer0.com
on 04 Apr 2025 00:01
nextcollapse
it says “this hidden site”, meaning it was a site on the dark web. It probably took them a while to figure out where the site was located so they could shut it down.
Schadrach@lemmy.sdf.org
on 04 Apr 2025 13:18
collapse
it says “this hidden site”, meaning it was a site on the dark web.
Not just on the dark web (which technically is anything not indexed by search engines) but hidden sites are specifically a Tor thing (though Freenet/Hyphanet has something similar, but it’s called something else). Usually a Tor hidden site has a URL that ends in .onion, and the Tor protocol has a structure for routing .onion addresses.
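(As a sketch of that structure: a v3 .onion address is just an encoding of the service’s ed25519 public key plus a checksum and a version byte, per Tor’s rend-spec-v3. Details quoted from memory, so treat them as an assumption.)

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key."""
    assert len(pubkey) == 32
    version = b"\x03"
    # checksum = SHA3-256(".onion checksum" || pubkey || version), truncated
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"

# Dummy all-zero key just to show the shape; real services generate a keypair.
print(onion_v3_address(bytes(32)))  # 56 base32 chars + ".onion"
```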
OsrsNeedsF2P@lemmy.ml
on 03 Apr 2025 01:23
nextcollapse
On average, around 3.5 new videos were uploaded to the platform every hour, many of which were previously unknown to law enforcement.
Absolutely sick and vile. I hope they honeypotted the site and that the arrests keep coming.
blazeknave@lemmy.world
on 03 Apr 2025 17:34
collapse
I just got ill
muhyb@programming.dev
on 03 Apr 2025 05:06
nextcollapse
Wow, with such a daring name as well. Fucking disgusting.
Siegfried@lemmy.world
on 03 Apr 2025 22:33
collapse
I once saw a list of defederated Lemmy instances. In most cases, and I mean like 95% of them, the reason for the defederation was pretty much in the instance name. CP everywhere. Humanity is a freaking mistake.
Squizzy@lemmy.world
on 03 Apr 2025 22:48
nextcollapse
Is it not encouraging that it is ostracised and removed from normal people? There are horrible parts of everything in nature; life is good despite those people, and because of the rest combatting their shittiness.
drmoose@lemmy.world
on 04 Apr 2025 00:19
nextcollapse
You’re just seeing “survivor’s bias” (as nasty as that sounds in this case) not a general representation.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 02:51
collapse
No, there really are a lot of pedophiles on Lemmy
BussyGyatt@feddit.org
on 04 Apr 2025 03:27
nextcollapse
Idk, I’m at like 7 months, and the only time I was able to Ctrl-F “pedo” in your history was when you were talking about Trump (which, fair) and then again about 4chan pedophiles (which, again).
drmoose@lemmy.world
on 04 Apr 2025 03:55
nextcollapse
I’ve been on Lemmy for years (even when .ml was the only instance) and hadn’t seen anything of the sort, though I don’t go digging for it either. I doubt that Lemmy is any worse than Facebook or Telegram when it comes to this.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 04:26
collapse
As said before, that person was not advocating for anything. He made a qualified statement, which you answered with examples of kids in cults, and flipped out calling him all kinds of nasty things.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 15:26
collapse
Lmfao as I stated, they said that physical sexual abuse “PROBABLY” harms kids but they have only done research into their voyeurism kink as it applies to children.
Also Epstein got a lot of cover from non-criminal association with some rich and powerful people. Not everyone who rode on his plane was a nonce.
On the other hand, Trump was very closely associated with Epstein for an extended period. That’s not the same as someone glitzing up Epstein’s guest list in support of a charity fundraiser.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 02:51
collapse
I regularly see people on Lemmy advocating for pedophilia, at LEAST every 3 months as a popular, upvoted stance. I argue with them in my history
pyre@lemmy.world
on 04 Apr 2025 03:27
nextcollapse
What the fuck, I’ve never run into anyone like that. Don’t even wanna know where you do this regularly
LustyArgonianMana@lemmy.world
on 04 Apr 2025 04:26
collapse
ifItWasUpToMe@lemmy.ca
on 04 Apr 2025 04:39
nextcollapse
I feel like what he’s trying to say is that it shouldn’t be the end of the world if a kid sees a sex scene in a movie, like it should be OK for them to know it exists. But the way he phrases it is questionable at best.
When I was a kid I was forced to leave the room when any intimate scenes were in a movie and I honestly do feel like it fucked with my perception of sex a bit. Like it’s this taboo thing that should be hidden away and never discussed.
I’m sorry, but classifying that as advocating for pedophilia is crazy. All they said is they don’t know about studies regarding it, so they said “probably” instead of making a definitive statement. You took that word and ran with it; your response is extremely over the top and hostile to someone who didn’t advocate for what you’re saying they advocate for.
It’s none of my business what you do with your time here but if I were you I’d be more cool headed about this because this is giving qanon.
Schadrach@lemmy.sdf.org
on 04 Apr 2025 12:45
nextcollapse
Even then, a common bit you’ll hear from people actually defending pedophilia is that the damage caused is a result of how society reacts to it or the way it’s done because of the taboo against it rather than something inherent to the act itself, which would be even harder to do research on than researching pedophilia outside a criminal context already is to begin with. For starters, you’d need to find some culture that openly engaged in adult sex with children in some social context and was willing to be examined to see if the same (or different or any) damages show themselves.
And that’s before you get into the question of defining where exactly you draw the age line before it “counts” as child sexual abuse, which doesn’t have a single, coherent answer. The US alone has at least three different answers to how old someone has to be before having sex with them is not illegal based on their age alone (16-18, with 16 being most common), with many having exceptions that go lower (one if the partners are close “enough” in age are pretty common). For example in my state, the age of consent is 16 with an exception if the parties are less than 4 years difference in age. For California in comparison if two 17 year olds have sex they’ve both committed a misdemeanor unless they are married.
None of this applies to the comment they cited as an example of defending pedophilia.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 15:27
collapse
They literally investigated specific time frames of their voyeurism kink in medieval times extensively, and wrote several paragraphs in favor of having children watch adults have sex, but couldn’t be bothered to do the most basic research showing that sex abuse is harmful to children.
“they knew some things and didn’t know some things” isn’t worth getting so worked up over. they knew the mere concept of sex being taboo negatively affected them and didn’t want to make definitive statements about things they didn’t research. believe it or not lemmy comments are not dissertations and most people just talk and don’t bother researching every tangential topic just to make a point they want to make.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 19:05
collapse
You’re doing a lot of legwork to defend someone who fantasizes about and researches extensively children watching adults have sex, and then uses that research to justify the idea to other adults
are you basing this on other comments by the same person? because I saw none of that in the comment you cited and this is starting to sound like either projection or some sort of trauma-led bad faith interpretation to be generous.
also fuck off with that legwork bullshit I’m just replying to your unhinged comments don’t pretend like I’m putting in any effort here.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 20:40
collapse
Gee, yet another month where Lemmy users go all out to protect creeps
Gross. Probably the reason they got banned from Reddit for the same thing: promoting or soliciting CSAM.
LustyArgonianMana@lemmy.world
on 04 Apr 2025 04:32
collapse
I don’t really think Reddit minds that, actually, given that u/spez was the lead mod of r/jailbait until he got caught and hid who the mods were
HappyFrog@lemmy.blahaj.zone
on 03 Apr 2025 15:15
nextcollapse
1.8 million users and they only caught 1000?
otp@sh.itjust.works
on 03 Apr 2025 21:44
nextcollapse
I imagine it’s easier to catch uploaders than viewers.
It’s also probably more impactful to go for the big “power producers” simultaneously and quickly before word gets out and people start locking things down.
HappyFrog@lemmy.blahaj.zone
on 04 Apr 2025 06:06
nextcollapse
Yeah, I don’t suspect they went after any viewers, only uploaders.
Schadrach@lemmy.sdf.org
on 04 Apr 2025 10:32
collapse
It also likely gives you the best $ spent/children protected rate, because you know the producers have children they are abusing which may or may not be the case for a viewer.
Maeve@midwest.social
on 04 Apr 2025 01:41
collapse
79
HappyFrog@lemmy.blahaj.zone
on 04 Apr 2025 06:04
collapse
79 arrested, but it seems they found the identity of a thousand or so.
GnuLinuxDude@lemmy.ml
on 03 Apr 2025 17:05
nextcollapse
Every now and again I am reminded of my sentiment that the introduction of “media” onto the Internet is a net harm. Maybe 256 dithered color photos like you’d see in Encarta 95 and that’s the maximum extent of what should be allowed. There’s just so much abuse from this kind of shit… despicable.
adhdplantdev@lemm.ee
on 03 Apr 2025 17:49
nextcollapse
I think it just shows all the hideousness of humanity in all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown from its barbaric ways.
deegeese@sopuli.xyz
on 03 Apr 2025 17:50
nextcollapse
Let’s get rid of the printing press because it can be used for smut. /s
GnuLinuxDude@lemmy.ml
on 03 Apr 2025 18:48
collapse
great pointless strawman. nice contribution.
deegeese@sopuli.xyz
on 03 Apr 2025 18:50
nextcollapse
It’s satire of your suggestion that we hold back progress but I guess it went over your head.
_cryptagion@lemmy.dbzer0.com
on 03 Apr 2025 23:59
collapse
It’s not a strawman if they repeat your own logic back at you. You just had a shit take.
GnuLinuxDude@lemmy.ml
on 04 Apr 2025 05:44
collapse
my take was everyone should be illiterate. good work.
Blackmist@feddit.uk
on 03 Apr 2025 18:10
nextcollapse
Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.
As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They’d get it in magazines from countries where it was still legal.
I suspect it’s far less prevalent now than it’s ever been. It’s now pretty much universally seen as unacceptable, which is a good start.
mic_check_one_two@lemmy.dbzer0.com
on 03 Apr 2025 19:13
collapse
The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s… It ended up being used as evidence against Eva’s mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only like 5 or 6 years old, under the guise of “art”. It wasn’t until the Playboy shoot that authorities started digging into the mother’s portfolio.
But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.
TankovayaDiviziya@lemmy.world
on 03 Apr 2025 18:32
nextcollapse
It is easy to feel very disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. And it is also important to remember that technology is not inherently bad; it is a neutral object, but people can use it for either good or bad purposes.
SpiceDealer@lemmy.dbzer0.com
on 03 Apr 2025 23:28
collapse
With that logic, I might as well throw away my computer and phone and go full Uncle Ted.
mic_check_one_two@lemmy.dbzer0.com
on 03 Apr 2025 19:16
nextcollapse
Here’s a reminder that you can submit photos of your hotel room to law enforcement, to assist in tracking down CSAM producers. The vast majority of sex trafficking media is produced in hotels. So being able to match furniture, bedspreads, carpet patterns, wallpaper, curtains, etc in the background to a specific hotel helps investigators narrow down when and where it was produced.
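(The matching itself is essentially reverse image search over a reference set of hotel-room photos. A toy sketch of the idea using perceptual hashes, assuming the Pillow and imagehash libraries; production systems use far more robust learned image features, but the workflow is the same.)

```python
from PIL import Image
import imagehash

def build_index(reference_photos: dict) -> dict:
    """Map 'hotel / room type' labels to perceptual hashes of reference photos."""
    return {label: imagehash.phash(Image.open(path))
            for label, path in reference_photos.items()}

def best_match(query_photo: str, index: dict):
    q = imagehash.phash(Image.open(query_photo))
    # Smaller Hamming distance = more similar furnishings/background.
    return min(((label, q - h) for label, h in index.items()), key=lambda m: m[1])

# Hypothetical paths, for illustration only:
# index = build_index({"Hotel A, standard double": "hotel_a.jpg"})
# print(best_match("submitted_room_photo.jpg", index))
```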
Snowclone@lemmy.world
on 04 Apr 2025 06:01
collapse
I worked in customer service a long time. No one was trained on how to be law enforcement and no one was paid enough to be entrusted with public safety beyond the common sense everyday people have about these things. I reported every instance of child abuse I’ve seen, and that’s maybe 4 times in two decades. I have no problem with training and reporting, but you have to accept that the service staff aren’t going to police hotels.
thickertoofan@lemm.ee
on 04 Apr 2025 13:38
collapse
Nice to know. Thanks.
Gaxsun@lemmy.zip
on 03 Apr 2025 23:29
nextcollapse
If that’s the actual splash screen that pops up when you try to access it (no, I’m not going to go to it and check, I don’t want to be on a new and exciting list) then kudos to the person who put that together. Shit goes hard. So do all the agency logos.
Feds have been stepping up their seized website banner game lately. The one for Genesis Market was pretty cool too.
SpiceDealer@lemmy.dbzer0.com
on 03 Apr 2025 23:33
nextcollapse
Massive congratulations to Europol and its partners in taking this shit down and putting these perverts away. However, they shouldn’t rest on their laurels. The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.
FauxLiving@lemmy.world
on 04 Apr 2025 00:56
collapse
The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.
Sure, it’ll only cost you every bit of your privacy as governments make illegal and eliminate any means for people to communicate without the eye of Big Brother watching.
Every anti-privacy measure that governments put forward is always like “We need to be able to track your location in real time, read all of your text messages and see every picture that your phone ever takes so that we can catch the .001% of people who are child predators. Look at how scary they are!
Why are you arguing against these anti-pedophile laws?! You don’t support child sex predators do you?!”
Maeve@midwest.social
on 04 Apr 2025 01:38
nextcollapse
rottingleaf@lemmy.world
on 06 Apr 2025 15:11
collapse
This also helps child predators and other traffickers.
Having backdoors and means to create stalkerware and spy on people in various ways benefits those who have the energy to use them and some safety. A lack of truly private communications also benefits them. The victims generally have very little means to ask for help without the criminals knowing it.
Human trafficking, sexual exploitation, drugs - all these things are high-value crime. They benefit from law enforcement getting part of the pie, which means that law enforcement having a better ability to surveil communications will not help against them; the criminals will generally know what is safe and what is not for them, and they will get assistance in such services.
Surveillance helps against non-violent crime - theft, smuggling, fraud, and usually only low-value operations.
Surveillance doesn’t help against high-value crime with enough incentive to make connections in law enforcement, and money finds a way, so those connections are made and operations continue.
Giving more power to law enforcement means law enforcement trading it in relationship with organized crime. To function better, it needs more transparent, clean and accountable organization, not more power.
But all this is not important: when someone is guilt-shaming you into giving up your full rights, you should just tell them to fuck off. This concerns privacy.
This also concerns guns. The reason it’s hard to find arguments in favor of gun ownership (I mean combat arms, not handguns or hunting rifles) is that successful cases for it are rare (by nature, that’s normal; you don’t generally need a combat rifle in your life, your country also doesn’t generally need a nuke, but it has one and many more), while unsuccessful cases (bad outcome, but proving the need for gun ownership) are more common, yet hard to notice - it’s every time you obey when you shouldn’t (by that I mean that you harm others by obeying).
Point being - no person telling you that dignity should be traded for a better life has any idea. You are not a criminal for desiring and achieving privacy; you are not a criminal for doing the same with the means to defend yourself; and you are not a criminal for saying all politicians and bureaucrats of your country are shit-swimming jerks and should be fired, or even demanding it. And if someone makes a law telling you differently, that’s not a law - someone just forgot they are not holding Zeus by the beard.
j0ester@lemmy.world
on 03 Apr 2025 23:44
nextcollapse
They also seized 72,000 illegal videos from the site and personal information of its users, resulting in arrests of 1,400 suspects around the world.
Wow
RedPostItNote@lemmy.world
on 03 Apr 2025 23:55
nextcollapse
Imagine if humans evolved enough to self-solve the problem of liking this shit.
Maeve@midwest.social
on 04 Apr 2025 01:40
collapse
1,393 suspects identified
79 suspects arrested
Over 3,000 electronic devices seized
39 children protected
drmoose@lemmy.world
on 04 Apr 2025 00:17
nextcollapse
And it didn’t even require sacrificing encryption huh!
eugenevdebs@lemmy.dbzer0.com
on 04 Apr 2025 01:52
nextcollapse
“See, we caught these guys without doing it. Think of how many more we can catch if we do! Like all the terrorists America has caught by violating their privacy. …Maybe some day they will.”
Basically the only reason I read the article was to find out whether they needed a “backdoor” in encryption. Guess they don’t need it, like everyone with a little bit of IT knowledge always told them.
LovableSidekick@lemmy.world
on 04 Apr 2025 03:32
nextcollapse
Kidflix sounds like a feature on Nickelodeon. The world is disgusting.
Or a Netflix for children/video editing app for primary schoolers in the early 2000s/late 1900s.
thecomeback@programming.dev
on 04 Apr 2025 05:22
nextcollapse
During the investigation, Europol’s analysts from the European Cybercrime Centre (EC3) provided intensive operational support to national authorities by analysing thousands of videos.
I don’t know how you can do this job and not get sick because looking away is not an option
T156@lemmy.world
on 04 Apr 2025 05:44
nextcollapse
You do get sick, and I would be most surprised if they didn’t allow people to look away and take breaks/get support as needed.
Most emergency line operators and similar kinds of inspectors get them, so it would be odd if they did not.
Indeed, but in my country the psychological support is even mandatory. Furthermore, I know there have been pilots using ML to go through the videos. When the system detects explicit material, an officer has to confirm it. But it prevents them from going through it all day, every day, for each video. I think Microsoft has also been working on a database with hashes that LEO provides to automatically detect materials that have already been identified. All in all, a gruesome job, but fortunately technology is alleviating the harshest activities bit by bit.
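(The hash-database idea in miniature: the Microsoft system being referred to, PhotoDNA, is a proprietary perceptual hash, so this sketch substitutes plain SHA-256, which only catches byte-identical copies. The hash list and directory are placeholders.)

```python
import hashlib
from pathlib import Path

# Placeholder set: in practice the list comes from law enforcement / NCMEC,
# and the hashes are perceptual (robust to re-encoding), not cryptographic.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def flag_known_material(upload_dir: str) -> list:
    """Return paths whose contents match a known hash, for officer review."""
    flagged = []
    for path in Path(upload_dir).glob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_HASHES:
                flagged.append(path)  # a human officer confirms each hit
    return flagged

# print(flag_known_material("/srv/uploads"))  # hypothetical directory
```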
And this is for law enforcement level of personnel. Meta and friends just outsource content moderation to low-wage countries and let the poors deal with the PTSD themselves.
Let’s hope that’s what AI can help with, instead of techbrocracy
ziggurat@lemmy.world
on 04 Apr 2025 06:53
collapse
Yes, my wife used to work in the ER. She still tells the same stories over and over again 15 years later, because the memories of the horrible shit she saw don’t go away.
the_riviera_kid@lemmy.world
on 04 Apr 2025 16:40
collapse
This kind of shit is why I noped out of the digital forensics field. I would have killed myself if I had to see that shit every day.
Doctor_Satan@lemm.ee
on 04 Apr 2025 13:00
nextcollapse
Goddam what an obvious fucking name. If you wrote a procedural cop show where the child traffickers ran a site called KidFlix, you’d be laughed out of the building for being so on-the-nose.
rottingleaf@lemmy.world
on 06 Apr 2025 14:30
collapse
Depends on your taste for stories and the general atmosphere. I think in better parts of Star Wars EU this would make sense (or it wouldn’t, but the right way, same as in reality).
Maybe Jeff Bezos will write an article about him and editorialize about “personal liberty”. I have to keep posting this because every day another MAGA lover, religious bigot, or otherwise pretend-upstanding community member is indicted or arrested for heinous acts against women and children.
threaded - newest
Does it feel odd to anyone else that a platform for something this universally condemned in any jurisdiction can operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system and nearly six figure number of videos? I mean even if we assume that some of those 4 years were intentional to allow law enforcement to catch as many perpetrators as possible this feels too similar to fully legal operations in scope.
Illegal business can operate online for a long time if they have good OpSec. Anonymous payment systems are much easier these days because of cryptocurrencies.
Is that why Trump is so for them?
Yeah, more or less
It’s a side effect of privacy and security. The one side effect they’re trying to use to undermine all of the privacy and security.
This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their “businesses” from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.
In the real world, criminals (e.g. drug dealers) also use cars, so you could argue, that druck trafficking is a side effect of people having cars…
Well, it does have to do with privacy and security, it just doesn’t matter if it’s legal or not for them. These people (in the US) always make a point that criminals will buy guns whether it’s legal or not, but then they’ll argue they need to destroy privacy because criminals are using it. It doesn’t make sense, but it doesn’t need to because honesty or consistency aren’t important.
This platform used Tor. And because we want to protect privacy, they can make use of it.
This particular platform used tor. It doesn’t mean all platforms are using privacy centric anonymous networks. There are incidents with people using kik, Snapchat, Facebook and other clear net services to perform criminal actions such as drugs or cp.
For anyone interested in the Kik controversy: There’s a great episode of the Darknet Diaries podcast about it: darknetdiaries.com/episode/93/
There are a few countries that would disagree
Which countries do you have in mind where videos of sexual child abuse are legal?
Context is important I guess. So two things.
Is something illegal if it’s not prosecuted?
Is it CSA if the kid is 9 but that’s marrying age in that country?
If you answer yes, then no, then we’ll not agree on this topic.
I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more wide-spread bans on videos including sex of any kind.
As for prosecution, yes, it is still illegal if it is not prosecuted. There are many reasons not to prosecute something ranging all the way from resource and other means related concerns to intentionally turning a blind eye and only a small minority of them would lead that country to actively sabotage a major international investigation, especially after the trade-offs are considered (such as loss of international reputation by refusing to cooperate).
Pick any country where child marriage is legal and where women are a object the man owns
A marketing person? They took “Netflix” and changed the first three letters lol
So you are saying it is too creative for the average person in marketing?
Exactly! There are plethora of *flix sites out there including adult ones. It does not take much of marketing skill to name site like this.
With the amount of sites that are easily accessed on the dark net though the hidden wiki and other sites. This might of been a honeypot from the start.
On the contrary, why would they announce that they seized the site? To cause more panic, and to exaggerate the actual situation?
In addition, that last point should be considered because even if they used these type of operations, honeypotting would still be considered illegal. So Ultimately what is stopping the supreme power to abuse that power on other people?
No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence which undoubtedly protected far more children than it harmed and even that was considered too far for many.
“That would be an unspeakable atrocity”, yet there is contradiction in the final sentence. The issue is, what evidence is there to prove such thing operation actually works, as my last point implied - what stops the government from abusing this sort of operation. With “covert” operations like this the outcome can be catastrophic for everyone.
It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they’re doing. I’ve often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.
If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.
I don’t know about that.
I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.
When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.
Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.
I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.
I would consider all of these to be trawling dark waters.
…and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.
Correct. And even then, there’s vanishingly little CP on lemmy.
File-sharing and online chat seem like basic internet activities to me.
This ain’t the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.
I’ve been able to explain it to people pretty easily as “like Discord, but without Discord administration getting to control what’s allowed, only whoever happens to run that particular server.”
Not stumbled upon it, but I’ve met a couple of people offering it on mostly normal Discord servers.
Search “AI woman porn miniskirt” and tell me you don’t see questionable results in the first two pages: women who at least appear possibly younger than 18. Because AI is so heavily corrupted with this content en masse, it has leaked over into Google searches, with most porn categories now corrupted by AI seeds that can be anything.
Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.
There are very weird, very large movements regarding illegal content (whether you intentionally search it or not) and blackmail and that’s all I will point out for now
Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point but I think we should acknowledge the difference between a picture of a real child actively being harmed vs a 100% fake image. I didn’t find any AI CP, but even if I did, it’s in an entire different universe of morally bad.
That was, what, fifteen years ago? It’s why I said “in the last decade”.
“Clearly in their late teens,” lol no. And since AI doesn’t have age, it’s possible that was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.
Obviously there’s a difference with AI porn vs real; that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses images to seed its new images, including CSAM.
It’s fucking AI, the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.
We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.
It’s meant to LOOK LIKE a 14 yr old because it is SEEDED OFF 14 YR OLDS, so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE, per the original commenter’s claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE? Now imagine who the CHILD MODELS are seeded from.
You’re one of the people I am talking about when I say Lemmy has a lot of creepy pedos on it FYI to all the readers, look at their history
I’m offended by stupid-ass shit and feel compelled to call it out. Thinking AI-generated fake images is just as harmful as actual children getting abused is perhaps the most obvious definition of “stupid-ass shit” since the invention of stupid-ass shit.
No one is out here making stupid-ass claims around child marriage.
Why are YOU so passionate about AI porn? Why do you people make this an issue at all when there are real actual kids out there being harmed?
Way to miss both of my points. Read it again.
Way to not answer the question.
That’s…not how AI image generation works? AI image generation isn’t just building a collage from random images in a database - the model doesn’t have a database of images within it at all - it just has a bunch of statistical weightings and net configuration that are essentially a statistical model for classifying images, being told to produce whatever inputs maximize an output resembling the prompt, starting from a seed. It’s not “seeded with an image of a 15 year old”; it’s seeded with white noise and essentially asked to show what that noise looks like as (in this case) “woman porn miniskirt”, then repeated a few times until the resulting image is stable.
Unless you’re arguing that somewhere in the millions of images tagged “woman” being analyzed to build that statistical model is probably at least one person under 18, and that any image of “woman” generated by such a model is necessarily underage because the weightings were impacted however slightly by that image or images, in which case you could also argue that all drawn images of humans are underage because whoever drew it has probably seen a child at some point and therefore everything they draw is tainted by having been exposed to children ever.
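To make that concrete, here’s a minimal sketch of the sampling loop being described. The `denoise` function is a hypothetical stand-in for the trained network (the real one is a large neural net holding only statistical weights); the point is that generation starts from random noise and refines it iteratively, with no image database consulted at any step.

```python
import numpy as np

def denoise(image, prompt, step):
    """Hypothetical stand-in for the trained network: nudges the image
    toward something the model scores as matching the prompt. The real
    model holds only statistical weights, not a database of photos."""
    raise NotImplementedError("placeholder for a diffusion model")

def generate(prompt, steps=50, shape=(512, 512, 3), seed=0):
    rng = np.random.default_rng(seed)
    image = rng.normal(size=shape)   # the actual "seed": pure white noise
    for step in range(steps):        # iterative refinement until stable
        image = denoise(image, prompt, step)
    return image
```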
Yes, it is seeded with kids’ faces, including specific children like someone’s classmate. And yes, those children, all of them, who it uses as a reference to make a generic kid face, are all being abused because it’s literally using their likeness to make CSAM. That’s quite obvious.
It would be different if the AI was seeding models from cartoons, but it’s from real images.
OK, so this is just the general anti-AI image generation argument where you believe any image generated is in some meaningful way a copy of every image analyzed to produce the statistical model that eventually generated it?
I’m surprised you’re going the CSAM route with this and not just arguing that any AI generated sexually explicit image of a woman is nonconsensual porn of literally every woman who has ever posted a photo on social media.
No, I am not saying that.
I’m saying if an AI image is sexual and seeded from irl child models, then it is CSAM.
Adult women who consensually post their image to social media are WAY different than children who can’t consent to even enter a contract.
Also, I did argue that already when I mentioned how many AI porn models are directly seeded from Scarlett Johansson’s face. Can you read?
To be clear, when you say “seeded from” you mean an image that was analyzed as part of building the image classifying statistical model that is then essentially running reverse to produce images, yes?
And you are arguing that every image analyzed to calculate the weights on that model is in a meaningful way contained in every image it generated?
I’m trying to nail down exactly what you mean when you say “seeded by.”
Most definitely not clean lmao, you’re just not actively searching for it, or stumbling onto it.
That’s…what I said.
It can hide in plain sight, and then when you dig into someone’s profile, it can lead to a person or a group discussing CSAM and bestiality, not just CP - on a site similar to r/pics, or a porn site. Yeah, sometimes you stumble onto a site like that, but it seems to happen when people search for porn outside of Pornhub and its affiliated sites; remember PH sanitized their site because of this. Last decade there was an article about an obscure site that had Reddit-like porn subs, etc. People were complaining about the CSAM, and nothing was done about it. It was eventually taken down for legal reasons not related to CSAM.
I can definitely see how people could find it while looking for porn. I don’t understand how people can do this stuff out in the open with no consequences.
Yeah, it’s often hidden too well to be easily found, and authorities might want to gather evidence, so they let it accumulate and then they pounce. One of the sites was mostly innuendo and talk about committing it, without actually distributing the material; they co-opt certain images to pervert them. Other deviancies like bestiality were also present.
It would feel odd, but you have to remember we live in a world where Epstein was allowed to get away with what he did until the little people found out.
Trump was his most frequent guest, and Trump had his goons do everything in their power to get rid of the evidence when he was still alive. A lot of politicians from different countries are part of it, as are Hollywood execs; Weinstein was probably the most infamous one.
Fuck man. I used to use a program called “Kidpix” when I was a kid. It was like ms paint but with fun effects and sounds.
KidPix 1.0 available online
Holy. Shit. This is somehow glorious on an oled iPad screen. The late 1900s never looked so good.
Thank you for giving me a nostalgia jolt that I didn’t know I wanted, but am fully enjoying right now.
When I read the very first bit I was like “oh is that a new kids tv streaming platform from the same creators as kidpix?” Then I was immediately hit with the reality of the grim world we live in
Same here! We had it on a Macintosh LC 575. I will never forget those sound effects haha.
Haha yep, kidpix deluxe 3 I remember fondly
I used to love the dynamite tool!
I loved the explosion sound, and the “oh no” when you click the undo button. I have the Windows versions of KidPix on CD somewhere.
Omg I remember Kidpix! It was great!
This… not so much.
Oh no!
With everything going on right now, the fact that I still feel physically sick reading things like this tells me I haven’t gone completely numb yet. Just absolutely repulsive.
I know I’m not heartless yet because I am still traumatized by the brick in the window video…
Fuck. Don’t make me think about that video. Fuck. Shit.
I’ll probably regret asking, but I’m out of the loop and insatiably curious.
Brick in the window video?
If it’s what I’m thinking of, camera footage of a vehicle interior.
Driving down the highway, going under an overpass when a brick gets tossed by some kids and goes through the window.
Passenger hit, husband is driving and screams.
You know that scream they mention in The Princess Bride? That “only someone experiencing ultimate suffering” can make?
If you know, you know.
I’ve never seen that video but I can hear that scream from the husband. That’s some fucked up shit.
Oh no. I remember that video now. I didn’t need to remember that video. Why did I have to ask?!
It’s not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.
Edit: oh hurray, there’s two different brick videos.
Well, I know what other video I’m never watching.
And people wonder why I don’t like being around any vehicle that carries things…
Gonna ruin me, but seconding. Brick in the window video?
To sanitize the traumatic video as much as possible: A man is driving under an overpass and a brick is dropped through the passenger side window instantly killing his wife. He reacts in horror.
Not just horror… the torment in that voice… fucking hell. Feel so bad for them both.
Link to the brick video please?
If you really want to see this type of content you can easily find it with the tiniest amount of motivation.
With my luck I would try to find the video, end up getting my PC or phone infected with spyware or ransomware, and have a bad time. Yeah I rely on other humans to vet things.
Maybe the ransomware would be better than seeing the video
They’re probably referencing the video where a woman was killed after a brick flew through the windshield. I haven’t watched it, but it is on YouTube and I’ve heard that the husband’s cries are not so nice.
I don’t remember if it was kids throwing bricks off of a bridge or if it was something else.
1.8m users, how the hell did they run that website for 3 years?
That’s unfortunately (not really sure) probably the fault of Germany’s approach to this. It usually does not take these websites down, but instead tries to find the people behind them and seize them. The argument is: they would just use a backup and start a “KidFlix 2” or something like that. Some investigations show that this is not the case and that deleting is very effective. Also, the German approach completely ignores the victims’ side. They have to deal with old men masturbating to them getting raped online. Very disturbing…
Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off, but it is certainly not an easy choice to make.
It doesn’t though.
The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.
If everything posted to these forums were deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years to catch a handful of site admins.
Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.
Also, these sites don’t produce CSAM themselves. They just spread it - most of the CSAM exists already and isn’t made specifically for distribution.
Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content, the ones doing the part that is orders of magnitude worse than anything else about this.
I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads, because that’s orders of magnitude easier.
Of the 1400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t get my hopes up too much that anyone who was caught ever committed child abuse themselves.
I mean, 1400 identified out of 1.8 million really isn’t a whole lot to begin with.
If most are reuploads anyway that kills the whole argument that deleting things works though.
Not quite. Reuploading is at the very least an annoying process.
Uploading anything over Tor is a gruelling process. Downloading already takes a long time; uploading takes even longer. Most consumer internet plans aren’t symmetrical either, with significantly lower upload than download speeds. Plus, you need to find a direct-download provider which doesn’t block Tor exit nodes and where uploading/downloading is free.
Taking something down is quick. A script scraping these forums and automatically reporting the download links (any direct-download site quickly removes reported CSAM, by the way - no one wants to host this legal nightmare) can take down thousands of uploads per day.
Making the experience horrible leads to a slow death of those sites. Imagine if 95% of videos on [generic legal porn site] lead to a “Sorry! This content has been taken down.” message. How much traffic would the site lose? I’d argue quite a lot.
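For what it’s worth, a minimal sketch of the kind of reporting script being described might look like the following. The host name, endpoint, and form fields are all hypothetical; every real file host has its own abuse-report process (often a web form or an abuse@ address), so a real bot would need one adapter per host.

```python
import re
import requests

# Hypothetical host/endpoint mapping; real file hosts each have their
# own abuse-report process that a real bot would need an adapter for.
ABUSE_ENDPOINTS = {
    "files.example.com": "https://files.example.com/abuse/report",
}
LINK_RE = re.compile(r"https?://[\w.-]+/\S+")

def report_links(scraped_html: str) -> int:
    """Extract direct-download links from a scraped forum page and file
    an abuse report for each one with the hosting provider."""
    reported = 0
    for link in set(LINK_RE.findall(scraped_html)):
        host = link.split("/")[2]
        endpoint = ABUSE_ENDPOINTS.get(host)
        if endpoint is None:
            continue  # unknown host: would need manual escalation
        requests.post(endpoint, data={"url": link}, timeout=30)
        reported += 1
    return reported
```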
That is still CP, and distributing CP still harms children; eventually they want to move on to the real thing, as porn no longer satisfies them.
Isn’t this basically the same argument as arguing violent media creates killers?
The issue is, AI is often trained on real children, sometimes even real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.
Also since AI can generate photorealistic imagery, it also muddies the water for the real thing.
Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.
Understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but it’s pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.
Is it? That person is going out of their way to do actual violence. It feels like arguing someone watching a slasher movie is more likely to make them go commit murder is a much closer analogy to someone watching a cartoon of a child engaged in sexual activity or w/e being more likely to make them molest a real kid.
We could make it a video game about molesting kids and use Postal or Hatred as our points of comparison if it would help. I’m sure someone somewhere has made such a game, and I’m absolutely sure you’d consider COD “fun and escapism” while someone playing that sort of game is doing so “in bad faith”, despite both being simulations of something that is definitely illegal, and the core of the argument being that one causes the person to want to do the illegal thing more and the other does not.
That would be acceptable, but that’s unfortunately not what’s happening. Like I said: since the majority of the files are hosted on file-sharing platforms on the normal web, it’s way more effective to have them deleted by those platforms. Some investigative journalists tried this and documented the frustration among users of child porn websites at having to reupload all the GBs/TBs of material, such that they rather quit and shut down their sites than go the extra mile.
I think you are mixing up two different aspects of this and of similar past cases. In the past there was often a problem with takedowns of such sites, because German prosecutors did not regard themselves as being in charge of takedowns if the servers were somewhere overseas. Their main focus was to get the admins and users of those sites into jail.
In this specific case they were observing this platform (together with prosecutors from other countries in an orchestrated operation) to gather as much data as possible about its structure, payment flows, admins and users before moving into action and getting them arrested. The site has meanwhile been taken down.
If you blow up and delete such a darknet service immediately upon discovery, you may get rid of it (temporarily), but you might not catch many of the people behind it.
This feels like one of those things where couch critics aren’t qualified. There’s a pretty strong history of three letter agencies using this strategy successfully in other organized crime industries.
I used to work in netsec and unfortunately government still sucks at hiring security experts everywhere.
That being said, hiring here is extremely hard - you need to find someone with below-market salary expectations willing to work on such an ugly subject. Very few people can do that. I do believe money fixes this, though. Just pay people more, and I’m sure every European citizen wouldn’t mind a 0.1% tax increase for a more effective investigative force.
They probably make double or triple in the private sector; I doubt the government can match that salary. Even FB probably paid more, before they started using AI to sniff out CP.
I’m a senior dev and tbh I’d take a lower salary given the right cause, though having to work with this sort of material is probably the main bottleneck here. I can’t imagine how people working on this can even fall asleep.
Most cases of “we can’t find anyone good for this job” can be solved with better pay. Make your opening more attractive, then you’ll get more applicants and can afford to be picky.
Getting the money is a different question, unless you’re willing to touch the sacred corporate profits…
And yet there are cases like Kim Dotcom, Snowden, Manning, Assange…
The moment it was posted to wherever they were going to have to deal with that forever. It’s not like they can ever know for certain that every copy of it ever made has been deleted.
It says “this hidden site”, meaning it was a site on the dark web. It probably took them a while to figure out where the site was located so they could shut it down.
Not just on the dark web (strictly speaking, the dark web is content that requires special software to reach; pages that are merely not indexed by search engines are the deep web): hidden sites are specifically a Tor thing (Freenet/Hyphanet has something similar, but it’s called something else there). A Tor hidden site has a URL that ends in .onion, and the Tor protocol has a structure for routing .onion addresses.
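As an illustration of what that routing means in practice, here is a minimal sketch of fetching an onion site from Python through a local Tor client. It assumes Tor is running on its default SOCKS port (9050) and that `requests` is installed with SOCKS support (`pip install requests[socks]`); the address shown is DuckDuckGo’s published onion service, used as a harmless example.

```python
import requests

# "socks5h" makes the Tor proxy resolve the hostname itself, which is
# required: .onion names only resolve inside the Tor network.
proxies = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# DuckDuckGo's published v3 onion address, used as a harmless example.
url = "https://duckduckgogg42xjoc72x3sjasowoarfbgcmvfimaftt6twagswzczad.onion/"
print(requests.get(url, proxies=proxies, timeout=60).status_code)
```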
Absolutely sick and vile. I hope they honey potted the site and that the arrests keep coming.
I just got ill
Wow, with such a daring name as well. Fucking disgusting.
I once saw a list of defederated Lemmy instances. In most cases, and I mean like 95% of them, the reason for the defederation was pretty much in the instance name. CP everywhere. Humanity is a freaking mistake.
Is it not encouraging that it is ostracised and kept away from normal people? There are horrible parts of everything in nature; life is good despite those people, and because of the rest combatting their shittiness.
You’re just seeing “survivor’s bias” (as nasty as that sounds in this case) not a general representation.
No, there really are a lot of pedophiles on Lemmy
idk, I’m at like 7 months, and the only times I was able to Ctrl-F “pedo” in your history were when you were talking about Trump (which, fair) and then again about 4chan pedophiles (which, again).
I’ve been on Lemmy for years (even when .ml was the only instance) and haven’t seen anything of the sort, though I don’t go digging for it either. I doubt that Lemmy is any worse than Facebook or Telegram when it comes to this.
From a month ago: lemmy.world/post/26165823/15375845
Yeah, that’s on me; I only searched “pedo”.
As said before, that person was not advocating for anything. He made a qualified statement, which you answered with examples of kids in cults, and then flipped out calling him all kinds of nasty things.
Lmfao as I stated, they said that physical sexual abuse “PROBABLY” harms kids but they have only done research into their voyeurism kink as it applies to children.
Go off defending pedos, though 👏
The Epstein files are never going to be fully released; too many rich and powerful people are implicated.
Also Epstein got a lot of cover from non-criminal association with some rich and powerful people. Not everyone who rode on his plane was a nonce.
On the other hand, Trump was very closely associated with Epstein for an extended period. That’s not the same as someone glitzing up Epstein’s guest list in support of a charity fundraiser.
I regularly see people on Lemmy advocating for pedophilia, at LEAST every 3 months as a popular, upvoted stance. I argue with them in my history
What the fuck, I’ve never run into anyone like that. Don’t even wanna know where you do this regularly.
lemmy.world/post/26165823/15375845
I feel like what he’s trying to say is that it shouldn’t be the end of the world if a kid sees a sex scene in a movie, that it should be OK for them to know it exists. But the way he phrases it is questionable at best.
When I was a kid I was forced to leave the room when any intimate scenes were in a movie and I honestly do feel like it fucked with my perception of sex a bit. Like it’s this taboo thing that should be hidden away and never discussed.
I’m sorry, but classifying that as advocating for pedophilia is crazy. All they said is that they don’t know about studies regarding it, so they said “probably” instead of making a definitive statement. You took that word and ran with it; your response is extremely over the top and hostile to someone who didn’t advocate for what you’re saying they advocate for.
It’s none of my business what you do with your time here, but if I were you I’d be more cool-headed about this, because this is giving QAnon.
Even then, a common bit you’ll hear from people actually defending pedophilia is that the damage caused is a result of how society reacts to it or the way it’s done because of the taboo against it rather than something inherent to the act itself, which would be even harder to do research on than researching pedophilia outside a criminal context already is to begin with. For starters, you’d need to find some culture that openly engaged in adult sex with children in some social context and was willing to be examined to see if the same (or different or any) damages show themselves.
And that’s before you get into the question of defining where exactly you draw the age line before it “counts” as child sexual abuse, which doesn’t have a single, coherent answer. The US alone has at least three different answers to how old someone has to be before having sex with them is not illegal based on their age alone (16-18, with 16 being most common), with many states having exceptions that go lower (ones where the partners are close “enough” in age are pretty common). For example, in my state the age of consent is 16, with an exception if the parties are less than 4 years apart in age. In California, by comparison, if two 17 year olds have sex they’ve both committed a misdemeanor, unless they are married.
none of this applies to the comment they cited as an example of defending pedophilia.
They literally investigated specific time frames of their voyeurism kink in medieval times extensively, wrote several paragraphs in favor of having children watch adults have sex, but couldn’t be bothered to do the most basic of research that sex abuse is harmful to children.
“they knew some things and didn’t know some things” isn’t worth getting so worked up over. they knew the mere concept of sex being taboo negatively affected them and didn’t want to make definitive statements about things they didn’t research. believe it or not lemmy comments are not dissertations and most people just talk and don’t bother researching every tangential topic just to make a point they want to make.
You’re doing a lot of legwork to defend someone who fantasizes about and researches extensively children watching adults have sex, and then uses that research to justify the idea to other adults
Are you basing this on other comments by the same person? Because I saw none of that in the comment you cited, and this is starting to sound like either projection or, to be generous, some sort of trauma-led bad-faith interpretation.
Also, fuck off with that legwork bullshit. I’m just replying to your unhinged comments; don’t pretend like I’m putting in any effort here.
Gee, yet another month where Lemmy users go all out to protect creeps
again, i know nothing about the person, I’m just responding to you. so far your entire tirade seems to be based on the word probably.
Great, thanks for your opinion. I don’t care about it, but it’s there.
your comment doesn’t follow but at least it’s more hinged. have a nice weekend.
Again, your opinion is so bad it’s a compliment to be against you
my only opinion here was maybe don’t lose your shit over a simple word. I think it’s pretty solid.
Your opinion doesn’t matter and you don’t need to police my emotions or reactions
you’re seriously bad at reading
Gross. Probably the reason they got banned from Reddit for the same thing: promoting or soliciting CSAM material.
I don’t really think Reddit minds that, actually, given that u/spez was the lead mod of r/jailbait until he got caught and hid who the mods were.
1.8 million users and they only caught 1000?
I imagine it’s easier to catch uploaders than viewers.
It’s also probably more impactful to go for the big “power producers” simultaneously and quickly before word gets out and people start locking things down.
Yeah, I don’t suspect they went after any viewers, only uploaders.
It also likely gives you the best $ spent/children protected rate, because you know the producers have children they are abusing which may or may not be the case for a viewer.
79
79 arrested, but it seems they found the identity of a thousand or so.
For now.
Every now and again I am reminded of my sentiment that the introduction of “media” onto the Internet is a net harm. Maybe 256 dithered color photos like you’d see in Encarta 95 and that’s the maximum extent of what should be allowed. There’s just so much abuse from this kind of shit… despicable.
I think it just shows all the hideousness of humanity in all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown out of its barbaric ways.
Let’s get rid of the printing press because it can be used for smut. /s
great pointless strawman. nice contribution.
It’s satire of your suggestion that we hold back progress but I guess it went over your head.
It’s not a strawman if they repeat your own logic back at you. You just had a shit take.
my take was everyone should be illiterate. good work.
Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.
As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They’d get it in magazines from countries where it was still legal.
I suspect it’s far less prevalent now than it’s ever been. It’s now pretty much universally seen as unacceptable, which is a good start.
The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s… It ended up being used as evidence against Eva’s mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only 5 or 6 years old, under the guise of “art”. It wasn’t until the Playboy shoot that authorities started digging into the mother’s portfolio.
But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.
It is easy to feel very disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. It is also important to remember that technology is not inherently bad; it is a neutral object that people can use for either good or bad purposes.
With that logic, I might as well throw away my computer and phone and go full Uncle Ted.
Here’s a reminder that you can submit photos of your hotel room to law enforcement, to assist in tracking down CSAM producers. The vast majority of sex trafficking media is produced in hotels. So being able to match furniture, bedspreads, carpet patterns, wallpaper, curtains, etc in the background to a specific hotel helps investigators narrow down when and where it was produced.
traffickcam.com
Thank you for posting this.
Wouldn’t this work so much better if we got hoteliers on board instead of individuals?
I worked in customer service a long time. No one was trained on how to be law enforcement and no one was paid enough to be entrusted with public safety beyond the common sense everyday people have about these things. I reported every instance of child abuse I’ve seen, and that’s maybe 4 times in two decades. I have no problem with training and reporting, but you have to accept that the service staff aren’t going to police hotels.
Nice to know. Thanks.
If that’s the actual splash screen that pops up when you try to access it (no, I’m not going to go to it and check, I don’t want to be on a new and exciting list) then kudos to the person who put that together. Shit goes hard. So do all the agency logos.
Feds have been stepping up their seized website banner game lately. The one for Genesis Market was pretty cool too.
Massive congratulations to Europol and its partners in taking this shit down and putting these perverts away. However, they shouldn’t rest on their laurels. The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.
Sure, it’ll only cost you every bit of your privacy, as governments outlaw and eliminate any means for people to communicate without the eye of Big Brother watching.
Every anti-privacy measure that governments put forward is always like “We need to be able to track your location in real time, read all of your text messages and see every picture that your phone ever takes so that we can catch the .001% of people who are child predators. Look at how scary they are!
Why are you arguing against these anti-pedophile laws?! You don’t support child sex predators do you?!”
Then you end up like the USA.
This also helps child predators and other traffickers.
Having backdoors and means to create stalkerware and spy on people in various ways benefits those who have the energy to use them and some safety. A lack of truly private communications also benefits them: the victims generally have very little means to ask for help without the criminals knowing.
Human trafficking, sexual exploitation, drugs: all these things are high-value crime. They benefit from law enforcement getting part of the pie, which means that law enforcement having a better ability to surveil communications will not help against them; the criminals will generally know what is safe for them and what is not, and they will get assistance in such services.
Surveillance helps against non-violent crime - theft, smuggling, fraud, and usually only low-value operations.
Surveillance doesn’t help against high-value crime with enough incentive to make connections in law enforcement, and money finds a way, so those connections are made and operations continue.
Giving more power to law enforcement means law enforcement trading that power away in its relationships with organized crime. To function better, it needs a more transparent, clean and accountable organization, not more power.
But all this is not important: when someone is guilt-shaming you into giving up your full rights, you should just tell them to fuck off. This concerns privacy.
This also concerns guns. The reason it’s hard to find arguments in favor of gun ownership (I mean combat arms, not handguns or hunting rifles) is that successful cases for it are rare; by nature, that’s normal, since you don’t generally need a combat rifle in your life (your country also doesn’t generally need a nuke, but it has one and many more). Unsuccessful cases (bad outcomes that nevertheless prove the need for gun ownership) are more common but hard to notice: it’s every time you obey when you shouldn’t (by which I mean that you harm others by obeying).
Point being: no person telling you that dignity should be traded for a better life has any idea. You are not a criminal for desiring and achieving privacy; you are also not a criminal for doing the same with the means to defend yourself; and you are not a criminal for saying that all politicians and bureaucrats of your country are shit-swimming jerks who should be fired, or even demanding it. And if someone makes a law telling you differently, that’s not a law; someone just forgot they are not holding Zeus by the beard.
They also seized 72,000 illegal videos from the site and personal information of its users, resulting in arrests of 1,400 suspects around the world.
Wow
Imagine if humans evolved enough to self-solve the problem of liking this shit.
And it didn’t even require sacrificing encryption huh!
“See, we caught these guys without doing it; think of how many more we can catch if we do! Like all the terrorists America has caught by violating their privacy. …Maybe some day they will.”
Basically the only reason I read the article was to find out whether they needed a “backdoor” in encryption. Guess they don’t need it, like everyone with a little bit of IT knowledge always told them.
Kidflix sounds like a feature on Nickelodeon. The world is disgusting.
Or a Netflix for children/video editing app for primary schoolers in the early 2000s/late 1900s.
I don’t know how you can do this job and not get sick because looking away is not an option
You do get sick, and I would be most surprised if they didn’t allow people to look away and take breaks/get support as needed.
Most emergency line operators and similar kinds of inspectors get those accommodations, so it would be odd if these investigators did not.
Indeed, and in my country the psychological support is even mandatory. Furthermore, I know there have been pilots using ML to go through the videos: when the system detects explicit material, an officer has to confirm it, but it keeps them from going through all of it, all day, every day, for each video. I think Microsoft has also been working on a database of hashes that LEO provides, to automatically detect material that has already been identified. All in all a gruesome job, but fortunately technology is alleviating the harshest activities bit by bit.
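For a sense of how that hash matching works: Microsoft’s PhotoDNA itself is proprietary, but the workflow can be sketched with an open perceptual-hash library as a stand-in. Known material is reduced to hashes, new images are hashed the same way, and only near matches get routed to an officer for confirmation. The hash value below is made up for illustration.

```python
from PIL import Image   # pip install pillow imagehash
import imagehash

# Stand-in for a law-enforcement-provided set of known-material hashes;
# the value here is made up. Real systems (e.g. PhotoDNA) use a more
# robust proprietary hash, but the matching workflow is the same idea.
KNOWN_HASHES = {imagehash.hex_to_hash("fd01fe01fc01f803")}

def needs_human_review(path: str, max_distance: int = 5) -> bool:
    """Hash an image and flag it only if it falls within a small Hamming
    distance of a known hash, so officers confirm candidate matches
    instead of screening everything themselves."""
    h = imagehash.phash(Image.open(path))
    return any(h - known <= max_distance for known in KNOWN_HASHES)
```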
And this is for law enforcement level of personnel. Meta and friends just outsource content moderation to low-wage countries and let the poors deal with the PTSD themselves.
Let’s hope that’s what AI can help with, instead of techbrocracy
Yes, my wife used to work in the ER. She still tells the same stories over and over again 15 years later, because the memories of the horrible shit she saw don’t go away.
This kind of shit is why I noped out of the digital forensics field. I would have killed myself if I had to see that shit every day.
Goddam what an obvious fucking name. If you wrote a procedural cop show where the child traffickers ran a site called KidFlix, you’d be laughed out of the building for being so on-the-nose.
Depends on your taste for stories and the general atmosphere. I think in better parts of Star Wars EU this would make sense (or it wouldn’t, but the right way, same as in reality).
Excellent work. That’s an unimaginable amount of abuse material.
Good fucking riddance
Geez, two million? Good riddance. Great job everyone!
Happy cake day!
Thank you! 😃
Maybe Jeff Bezos will write an article about him and editorialize about “personal liberty”. I have to keep posting this because every day another MAGA lover, religious bigot, or otherwise pretend-upstanding community member is indicted or arrested for heinous acts against women and children.
Leak the subscribers’ details.