Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business (www.cnn.com)
from btaf45@lemmy.world to technology@lemmy.world on 20 Mar 2024 01:26
https://lemmy.world/post/13323235

#technology

threaded - newest

autotldr@lemmings.world on 20 Mar 2024 01:30 next collapse

This is the best summary I could come up with:


A New York state judge on Monday denied a motion to dismiss a lawsuit against several social media companies alleging the platforms contributed to the radicalization of a gunman who killed 10 people at a grocery store in Buffalo, New York in 2022, court documents show.

In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies — like Meta, Alphabet, Reddit and 4chan — “profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement,” including the time then 18-year-old Payton Gendron spent on their platforms viewing that material.

“They allege they are sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with ‘white replacement theory’,” the decision read.

“It is far too early to rule as a matter of law that the actions, or inaction, of the social media/internet defendants through their platforms require dismissal,” said the judge.

“While we disagree with today’s decision and will be appealing, we will continue to work with law enforcement, other platforms, and civil society to share intelligence and best practices,” the statement said.

We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”
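
The “image-hashing systems” mentioned in the statement generally refer to perceptual hashing, where visually similar images produce similar hash values, so re-uploads of a known banned image can be flagged even after resizing or re-encoding. A minimal sketch of the common average-hash approach (illustrative only, not any platform’s actual implementation):

```python
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a tiny grayscale thumbnail; each bit records whether a
    pixel is brighter than the mean. Similar images -> similar bits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance means a near-duplicate."""
    return bin(a ^ b).count("1")

# Hypothetical moderation check against a known banned image:
# if hamming(average_hash("banned.png"), average_hash("upload.png")) <= 5:
#     flag_for_review()  # invented hook, for illustration
```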


The original article contains 407 words, the summary contains 229 words. Saved 44%. I’m a bot and I’m open source!

Minotaur@lemm.ee on 20 Mar 2024 01:51 next collapse

I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

It just feels very, very dangerous to set this precedent where, when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school here is how”, and you hear that, yeah, that’s on you to report it. But it feels like we’re quickly slipping into a point where you have to start reporting a vast amount of people to the police en masse if they say anything even vaguely questionable simply to avoid potential fallout of being associated with someone committing a crime.

It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.

Zak@lemmy.world on 20 Mar 2024 02:17 next collapse

I think the design of media products around maximally addictive individually targeted algorithms in combination with content the platform does not control and isn’t responsible for is dangerous. Such an algorithm will find the people most susceptible to everything from racist conspiracy theories to eating disorder content and show them more of that. Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.

With that said, laws made and legal precedents set in response to tragedies are often ill-considered, and I don’t like this case. I especially don’t like that it includes Reddit, which was not using that type of individualized algorithm to my knowledge.
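
To make the mechanism concrete, here is a toy sketch (topic names and numbers invented for illustration): a recommender that rewards whatever a user lingers on will converge on that topic without ever inspecting what the content actually is.

```python
import random
from collections import defaultdict

TOPICS = ["news", "sports", "cooking", "conspiracy"]

def watch_time(topic: str) -> float:
    # Assumption: this particular user lingers slightly longer on one topic.
    return random.uniform(0.8, 1.2) * (1.5 if topic == "conspiracy" else 1.0)

weights = defaultdict(lambda: 1.0)
for _ in range(1000):
    # Recommend in proportion to whatever has "worked" on this user so far.
    topic = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
    weights[topic] += watch_time(topic)  # engagement is the only reward

total = sum(weights.values())
for t in TOPICS:
    print(f"{t:10s} {weights[t] / total:.0%}")
# The feed drifts heavily toward the lingered-on topic: the loop never
# asks whether the content is harmful, only whether it is engaging.
```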

deweydecibel@lemmy.world on 20 Mar 2024 02:59 next collapse

Attempts to moderate away the worst examples of it just result in people making variations that don’t technically violate the rules.

The problem then becomes if the clearly defined rules aren’t enough, then the people that run these sites need to start making individual judgment calls based on…well, their gut, really. And that creates a lot of issues if the site in question could be held accountable for making a poor call or overlooking something.

The threat of legal repercussions hanging over them is going to make them default to the most strict actions, and that’s kind of a problem if there isn’t a clear definition of what things need to be actioned against.

VirtualOdour@sh.itjust.works on 20 Mar 2024 03:15 next collapse

It’s the chilling effect they use in China: don’t make it clear what will get you in trouble, and people become too scared to say anything

Just another group looking to control expression by the back door

rambaroo@lemmynsfw.com on 20 Mar 2024 09:40 collapse

There’s nothing ambiguous about this. Give me a break. We’re demanding that social media companies stop deliberately driving negativity and extremism to get clicks. This has fuck all to do with free speech. What they’re doing isn’t “free speech”, it’s mass manipulation, and it’s very deliberate. And it isn’t disclosed to users at any point, which also makes it fraudulent.

It’s incredibly ironic that you’re accusing people of an effort to control expression when that’s literally what social media has been doing since the beginning. They’re the ones trying to turn the world into a dystopia, not the other way around.

rambaroo@lemmynsfw.com on 20 Mar 2024 09:34 next collapse

Bullshit. There’s no slippery slope here. You act like these social media companies just stumbled onto algorithms. They didn’t, they designed these intentionally to drive engagement up.

Demanding that they change their algorithms to stop intentionally driving negativity and extremism isn’t dystopian at all, and it’s very frustrating that you think it is. If you choose to do nothing about this issue I promise you we’ll be living in a fascist nation within 10 years, and it won’t be an accident.

bigMouthCommie@kolektiva.social on 20 Mar 2024 16:30 collapse

this is exactly why section 230 exists. sites aren't responsible for what other people post and they are allowed to moderate however they want.

refurbishedrefurbisher@lemmy.sdf.org on 20 Mar 2024 06:02 next collapse

This is the real shit right here. The problem is that social media companies’ data show that negativity and hate keep people on their website for longer, which means that they view more advertisement compared to positivity.

It is human nature to engage with disagreeable topics more than agreeable ones, and social media companies are exploiting that for profit.

We need to regulate algorithms and force them to be open source, so that anybody can audit them. They will try to hide behind “AI” and “trade secret” excuses, but lawmakers have to see above that bullshit.

Unfortunately, US lawmakers are both stupid and corrupt, so it’s unlikely that we’ll see proper change, and more likely that we’ll see shit like “banning all social media from foreign adversaries” when the US-based social media companies are largely the cause of all these problems. I’m sure the US intelligence agencies don’t want them to change either, since those companies provide large swaths of personal data to them.

admin@lemmy.my-box.dev on 20 Mar 2024 08:58 collapse

While this is true for Facebook and YouTube - last time I checked, reddit doesn’t personalise feeds in that way. It was my impression that if two people subscribe to the same subreddits, they will see the exact same posts, based on time and upvotes.

Then again, I only ever used third party apps and old.reddit.com, so that might have changed since then.

deweydecibel@lemmy.world on 20 Mar 2024 11:42 next collapse

It’s probably not true anymore, but at the time this guy was being radicalized, you’re right, it wasn’t algorithmically catered to them. At least not in the sense that it was intentionally exposing them to a specific type of content.

I suppose you can think of the way reddit works (or used to work) as being content agnostic. The algorithm is not aware of the sorts of things it’s suggesting to you, it’s just showing you things based on subreddit popularity and user voting, regardless of what it is.

In the case of YouTube and Facebook, their algorithms are taking into account the actual content and funneling you towards similar content algorithmically, in a way that is unique to you. Which means at some point their algorithm is acknowledging “this content has problematic elements, let’s suggest more problematic content”

(Again, modern reddit, at least on the app, is likely engaging in this now to some degree)
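
A rough sketch of that distinction (hypothetical data structures, not any platform’s real code): the first ranker produces the same feed for every user, while the second ranks by overlap with one user’s viewing history.

```python
posts = [
    {"title": "Cat pics", "score": 900, "topics": {"pets": 1.0}},
    {"title": "Fringe docu", "score": 40, "topics": {"conspiracy": 1.0}},
]

# Content-agnostic (old-reddit style): popularity only, identical for all.
def rank_agnostic(posts):
    return sorted(posts, key=lambda p: p["score"], reverse=True)

# Content-aware (YouTube/Facebook style): weight each post by how much its
# topics overlap with what this particular user has engaged with before.
def rank_personalized(posts, history):
    def affinity(p):
        return sum(p["topics"].get(t, 0.0) * w for t, w in history.items())
    return sorted(posts, key=affinity, reverse=True)

print([p["title"] for p in rank_agnostic(posts)])
# ['Cat pics', 'Fringe docu'] -- same for everyone
print([p["title"] for p in rank_personalized(posts, {"conspiracy": 5.0})])
# ['Fringe docu', 'Cat pics'] -- unique to this user's history
```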

cophater69@lemm.ee on 20 Mar 2024 12:05 collapse

That’s a lot of baseless suppositions you have there. Stuff you cannot possibly know - like how reddit content algos work.

cophater69@lemm.ee on 20 Mar 2024 11:57 collapse

Mate, I never got the same homepage twice on my old reddit account. I dunno how you can claim that two people with identical subs would see the same page. That’s just patently not true and hasn’t been for years.

admin@lemmy.my-box.dev on 20 Mar 2024 12:24 collapse

Quite simple, aniki. The feeds were ordered by hot, new, or top.

New was ORDER BY date DESC. Top was ORDER BY upvotes DESC. And hot was a slightly more complicated order that used a mixture of upvotes and time.

You can easily verify this by opening 2 different browsers in incognito mode and going to the old reddit frontpage - I get the same results in either. Again - I can’t account for the new reddit site because I never used it for more than a few minutes, but that’s definitely how the old one worked and still seems to.
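
For reference, the “hot” ordering from Reddit’s old open-source codebase worked roughly like this (a sketch of the published formula from r2/lib/db/_sorts.pyx; note that nothing in the inputs identifies the viewer):

```python
from datetime import datetime, timezone
from math import log10

# Site epoch used by the open-source formula (December 2005).
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    score = ups - downs
    order = log10(max(abs(score), 1))  # votes count logarithmically
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)  # newer posts outrank older

# Two users subscribed to the same subreddits feed the same (ups, downs,
# posted) tuples into this function, so they get the same ordering.
```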

rambaroo@lemmynsfw.com on 20 Mar 2024 09:29 collapse

Reddit is the same thing. They intentionally enable and cultivate hostility and bullying there to drive up engagement.

deweydecibel@lemmy.world on 20 Mar 2024 11:37 collapse

But not algorithmically catered to the individual.

Kalysta@lemmy.world on 20 Mar 2024 19:24 collapse

Which is even worse because more people see the bullying and hatred, especially when it shows up on a default sub.

morrowind@lemmy.ml on 20 Mar 2024 02:21 next collapse

Do you not think if someone encouraged a murderer they should be held accountable? It’s not everyone they interacted with, there has to be reasonable suspicion they contributed.

Also I’m pretty sure this is nothing new

deweydecibel@lemmy.world on 20 Mar 2024 02:41 next collapse

Depends on what you mean by “encouraged”. That is going to need a very precise definition in these cases.

And the point isn’t that people shouldn’t be held accountable, it’s that there are a lot of gray areas here, we need to be careful how we navigate them. Irresponsible rulings or poorly implemented laws can destabilize everything that makes the internet worthwhile.

VirtualOdour@sh.itjust.works on 20 Mar 2024 03:19 next collapse

Everyone on lemmy who makes guillotine jokes will enjoy their life sentence I’m sure

PhlubbaDubba@lemm.ee on 20 Mar 2024 04:45 next collapse

Is there currently a national crisis of Jacobins kidnapping oligarchs and beheading them in public I am unaware of?

morrowind@lemmy.ml on 20 Mar 2024 08:02 collapse

No

Unfortunately

rambaroo@lemmynsfw.com on 20 Mar 2024 09:53 collapse

Literally no one suggested that end users should be arrested for jokes on the internet. Fuck off with your attempts at trying to distract from the real issue.

Minotaur@lemm.ee on 20 Mar 2024 04:26 collapse

I didn’t say that at all, and I think you know I didn’t unless you really didn’t actually read my comment.

I am not talking about encouraging someone to murder. I specifically said that in overt cases there is some common sense civil responsibility. I am talking about the potential for the police to break down your door because you Facebook messaged a guy you’re friends with what your favorite local gun store was, and that guy also happens to listen to death metal and take antidepressants and the state has deemed him a risk factor level 3.

morrowind@lemmy.ml on 20 Mar 2024 07:39 collapse

I must have misunderstood you then, but this still seems like a pretty clear case where the platforms (not even individual people yet) did encourage him. I don’t think there’s any new precedent being set here

Minotaur@lemm.ee on 20 Mar 2024 16:40 collapse

Rulings often start at the corporation / large major entity level and work their way down to the individual. Think piracy laws. At first, only giant, clear bootlegging operations were really prosecuted for that, and then people torrenting content for profit, and then people torrenting large amounts of content for free - and now we currently exist in an environment where you can torrent a movie or whatever and probably be fine, but also if the criminal justice system wants to they can (and have) easily hit anyone who does with a charge for tens of thousands of dollars or years of jail time.

Will it happen to the vast majority of people who torrent media casually? No. But we currently exist in an environment where if you get unlucky enough or someone wants to punish you for it enough, you can essentially have this massive sentence handed down to you almost “at random”.

snooggums@midwest.social on 20 Mar 2024 02:39 next collapse

Systemic problems require systemic solutions.

Minotaur@lemm.ee on 20 Mar 2024 04:23 collapse

Sure, and I get that for like, healthcare. But ‘systemic solutions’ as they pertain to “what constitutes a crime” lead to police states really quickly imo

rambaroo@lemmynsfw.com on 20 Mar 2024 09:57 collapse

The article is about lawsuits. Where are you getting this idea that anyone suggested criminalizing people? Stop putting words in other people’s mouths. The most that’s been suggested in this thread is regulating social media algorithms, not locking people up.

Drop the melodrama and paranoia. It’s getting difficult to take you seriously when you keep making shit up about other people’s positions.

Minotaur@lemm.ee on 20 Mar 2024 12:21 collapse

I don’t believe you’ve had a lot of experience with the US legal system

galoisghost@aussie.zone on 20 Mar 2024 03:08 next collapse

Nah. This isn’t guilt by association

In her decision, the judge said that the plaintiffs may proceed with their lawsuit, which claims social media companies — like Meta, Alphabet, Reddit and 4chan — “profit from the racist, antisemitic, and violent material displayed on their platforms to maximize user engagement,”

Which, despite their denials, they actually know: nbcnews.com/…/facebook-knew-radicalized-users-rcn…

deweydecibel@lemmy.world on 20 Mar 2024 03:13 next collapse

Also worth remembering, this opens up avenues for lawsuits on other types of “harm”.

We have states that have outlawed abortion. What do those sites do when those states argue social media should be “held accountable” for all the women who are provided information on abortion access through YouTube, Facebook, reddit, etc?

dgriffith@aussie.zone on 20 Mar 2024 03:24 next collapse

This appears to be more the angle of the person being fed an endless stream of hate on social media and thus becoming radicalised.

What causes them to be fed an endless stream of hate? Algorithms. Who provides those algorithms? Social media companies. Why do they do this? To maintain engagement with their sites so they can make money via advertising.

And so here we are, with sites that see you viewed 65 percent of a stream showing an angry mob and conclude that you would like to see more angry mobs in your feed. Is it any wonder that shit like this happens?

PhlubbaDubba@lemm.ee on 20 Mar 2024 04:43 next collapse

It’s also known to intentionally show you content that’s likely to provoke you into fights online

Which just makes all the sanctimonious screed about avoiding echo chambers a bunch of horse shit, because that’s not how social behavior works outside the net. If you go out of your way to keep arguing with people who wildly disagree with you, you’re not avoiding echo chambers, you’re building a class action restraining order case against yourself.

Monument@lemmy.sdf.org on 20 Mar 2024 11:26 next collapse

I’ve long held this hunch that when people’s beliefs are challenged, they tend to ‘dig in’ and wind up more resolute. (I think it’s actual science and I learned that in a sociology class many years ago but it’s been so long I can’t say with confidence if that’s the case.)

Assuming my hunch is right (or at least right enough), I think that side of social media - driving up engagement by increasing discord also winds up radicalizing people as a side effect of chasing profits.

It’s one of the things I appreciate about Lemmy. Not everyone here seems to just be looking for a fight all the time.

Kalysta@lemmy.world on 20 Mar 2024 19:18 next collapse

It depends on how their beliefs are challenged. Calling them morons won’t work. You have to gently question them about their ideas and not seem to be judging them.

Monument@lemmy.sdf.org on 20 Mar 2024 21:24 collapse

Oh, yeah, absolutely. Another commenter on this post suggested my belief on it was from an Oatmeal comic. That prompted me to search it out, and seeing it spelled out again sort of opened up the memory for me.

The class was a sociology class about 20 years ago, and the professor was talking about cognitive dissonance as it relates to folks choosing whether or not they wanted to adopt the beliefs of another group. I don’t think he got into how to actually challenge beliefs in a constructive way, since he was discussing how seemingly small rifts can turn into big disagreements between social groups, but subsequent life experience and a lot of good articles about folks working with radicals to reform their beliefs confirm exactly what you commented.

Eccitaze@yiffit.net on 20 Mar 2024 20:32 collapse

You may have gotten this very belief from this comic

Monument@lemmy.sdf.org on 20 Mar 2024 21:26 collapse

Nah. I picked that up about 20 years ago, but the comic is a great one.
I haven’t read The Oatmeal in a while. I guess I know what I’ll be doing later tonight!

deweydecibel@lemmy.world on 20 Mar 2024 11:53 collapse

People have been fighting online long before algorithmic content suggestions. They may amplify it, but you can’t blame that on them entirely.

The truth is many people would argue and fight like that in real life if they could be anonymous.

Eldritch@lemmy.world on 21 Mar 2024 20:55 collapse

Absolutely. There’s a huge difference between hate speech existing and funneling a firehose of it at someone to keep them engaged. It’s not clear how this will shake out, but I doubt it will be the end of free speech. If it exists and you actively seek it out, that’s something else.

PhlubbaDubba@lemm.ee on 20 Mar 2024 04:38 next collapse

I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.

Firearms should be titled and insured, the owner should have an imposed duty to secure them, and the owner ought to face criminal penalty if the firearm titled to them was used by someone else to commit a crime. Either they handed a killer a loaded gun or they inadequately secured a firearm which was then stolen to be used in committing a crime; either way they failed their responsibility to society as a firearm owner and must face consequences for it.

solrize@lemmy.world on 20 Mar 2024 05:01 next collapse

This guy seems to have bought the gun legally at a gun store, after filling out the forms and passing the background check. You may be thinking of the guy in Maine whose parents bought him a gun when he was obviously dangerous. They were just convicted of involuntary manslaughter for that, iirc.

PhlubbaDubba@lemm.ee on 20 Mar 2024 05:31 collapse

Yup, I was just addressing the point of tangential arrest, sometimes it is well justified.

solrize@lemmy.world on 20 Mar 2024 05:37 collapse

Well you were talking about charging the gun owner if someone else commits a crime with their gun. That’s unrelated to this case where the shooter was the gun owner.

The lawsuit here is about radicalization but if we’re pursuing companies who do that, I’d start with Fox News.

Minotaur@lemm.ee on 20 Mar 2024 05:13 collapse

If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio - and he hangs himself with it, are you ready to be accused of being culpable for your brothers death?

PhlubbaDubba@lemm.ee on 20 Mar 2024 05:35 next collapse

Did he also use it as improvised ammunition to shoot up the local elementary school, to warrant the cord being considered a firearm?

I’m more confused where I got such a lengthy extension cord from! Am I an event manager? Do I have generators I’m running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?

Minotaur@lemm.ee on 20 Mar 2024 12:31 collapse

…. Aside from everything else, are you under the impression that a 10-15 ft extension cord is an odd thing to own…?

rambaroo@lemmynsfw.com on 20 Mar 2024 10:01 next collapse

Knowingly manipulating people into suicide is a crime and people have already been found guilty of doing it.

So the answer is obvious. If you knowingly encourage a vulnerable person to commit suicide, and your intent can be proved, you can and should be held accountable for manslaughter.

That’s what social media companies are doing. They aren’t loaning you extremist ideas to help you. That’s a terrible analogy. They’re intentionally serving extreme content to drive you into more and more upsetting spaces, while pretending that there aren’t any consequences for doing so.

jkrtn@lemmy.ml on 20 Mar 2024 14:41 collapse

Oh, it turns out an extension cord has a side use that isn’t related to its primary purpose. What’s the analogous innocuous use of a semiautomatic handgun?

Minotaur@lemm.ee on 20 Mar 2024 15:11 collapse

Self defense? You don’t have to be a 2A diehard to understand that it’s still a legal object. What’s the “innocuous use” of a VPN? Or a torrenting client? Should we imprison everyone who ever sends a link about one of these to someone who seems interested in their use?

jkrtn@lemmy.ml on 20 Mar 2024 15:32 collapse

You’re deliberately ignoring the point that the primary use of a semiautomatic pistol is killing people, whether self-defense or mass murder.

Should you be culpable for giving your brother an extension cord if he lies that it is for the porch? Not really.

Should you be culpable for giving your brother a gun if he lies that he needs it for self defense? IDK the answer, but it’s absolutely not equivalent.

It is a higher level of responsibility, you know lives are in danger if you give them a tool for killing. I don’t think it’s unreasonable if there is a higher standard for loaning it out or leaving it unsecured.

Minotaur@lemm.ee on 20 Mar 2024 15:38 collapse

“Sorry bro. I’d love to go target shooting with you, but you started taking Vyvanse 6 months ago and I’m worried if you blow your brains out the state will throw me in prison for 15 years.”

Besides, you’re ignoring the point. This article isn’t about a gun, it’s about basically “this person saw content we didn’t make on our website”. You think that won’t be extended to general content sent from one person to another? That if you send some pro-Palestine articles to your buddy, and a year or two later your buddy gets busted at an anti-Zionist rally, now you’re a felon because you enabled that? Boy, that would be an easy way for some hypothetical future administration to control speech!!

You might live in a very nice bubble, but not everyone will.

jkrtn@lemmy.ml on 20 Mar 2024 15:52 collapse

So you need a strawman argument transitioning from loaning a weapon unsupervised to someone we know is depressed. Now it is just target shooting with them, so distancing the loan aspect and adding a presumption of using the item together.

This is a side discussion. You are the one who decided to write strawman arguments relating guns to extension cords, so I thought it was reasonable to respond to that. It seems like you’re upset that your argument doesn’t make sense under closer inspection and you want to pull the ejection lever to escape. Okay, it’s done.

The article is about a civil lawsuit, nobody is going to jail. Nobody is going to be able to take a precedent and sue me, an individual, over sharing articles to friends and family, because the algorithm is a key part of the argument.

Minotaur@lemm.ee on 20 Mar 2024 16:15 collapse

Yeah man. Even if you loan it to them you shouldn’t be charged.

Lmfao okay yeah sure man. No one is this year. See you in 10. I know it’s easy to want to retreat to kind of a naive “this would never happen to ME!” worldview, and yeah. It probably won’t. But you have to consider all the innocent people it unjustly will happen to in coming years.

Also, not what a strawman is. You’re not really good at this.

Also you still can’t respond to anything not related to guns. All those VPN and torrenting points went right over your head, huh? Convenient. When you get busted for talking about how to store “several TB of photos” to some guy who turns out to be hoarding CP, I hope the “assisted in preserving pedophilic content” charge rests easy on you

jkrtn@lemmy.ml on 20 Mar 2024 16:28 collapse

You’re really deluded into thinking you’re correct and that your strawmen are good arguments. “If we do anything at all about this, then extension cords will be illegal,” really wet sobbing.

“If this civil lawsuit is allowed to proceed then we are already under 1984’s Big Brother police state, they are coming for you,” wild. Your imagination is a very frightening place. You feel threatened by so many things. Must be hard.

Why would I participate in your side quests? You like writing strawmen, have fun with it on your own.

Minotaur@lemm.ee on 20 Mar 2024 16:35 collapse

“Why would I participate in a conversation about the very real slippery slope of vague, easily exploited criminal rulings? That way I would have to think about it.”

jkrtn@lemmy.ml on 20 Mar 2024 16:45 collapse

“Vague” “easily exploited” “criminal” all doing a lot of work here, but it’s good that you recognize your own words are a slippery slope fallacy surrounded by strawmen.

So frightened. I hope you can get some help and feel better.

Minotaur@lemm.ee on 20 Mar 2024 16:56 collapse

Holy fuck dude, go back to Reddit if all you can do is quote the “logical fallacies!!!” infographic you have saved in your terabytes of photos. I can (and do) engage with other people who actually want to talk about complex issues here without you

jkrtn@lemmy.ml on 20 Mar 2024 17:14 collapse

"Everyone who disagrees with me has ‘terabytes of files.’"

  • A guy who likes serious and complex discussions

Minotaur@lemm.ee on 20 Mar 2024 17:16 collapse

You literally do though. You have a post saying you do. What are you talking about???

jkrtn@lemmy.ml on 20 Mar 2024 17:54 collapse

Hold up, let’s get this straight: you’re accusing me of redditor behavior, but this discussion has enraged you so much that you went looking through the comment history for anything you could possibly use as an ad hominem? Sometimes it is absolutely the case that every accusation is a confession.

There’s some grass outside, man, check it out.

Minotaur@lemm.ee on 20 Mar 2024 18:04 collapse

You are STILL looking at the chart of logical fallacies 😂😂😂

Arbiter@lemmy.world on 20 Mar 2024 05:32 next collapse

Yeah, but algorithmic delivery of radicalizing content seems kinda evil.

WarlordSdocy@lemmy.world on 20 Mar 2024 05:52 next collapse

I think the distinction here is between people and businesses. Is it the fault of people on social media for the acts of others? No. Is it the fault of social media companies for cultivating an environment that radicalizes people into committing mass shootings? Yes. The blame here is on the social media companies for not doing more to stop the spread of this kind of content. Even though that won’t stop this kind of content from existing, making it harder to access and find will at least reduce the number of people who go down this path.

rambaroo@lemmynsfw.com on 20 Mar 2024 09:51 next collapse

I agree, but I want to clarify. It’s not about making this material harder to access. It’s about not deliberately serving that material to people who weren’t looking it up in the first place in order to get more clicks.

There’s a huge difference between a user looking up extreme content on purpose and social media serving extreme content to unsuspecting people because the company knows it will upset them.

0x0@programming.dev on 20 Mar 2024 10:04 collapse

Is it the fault of social media for cultivating an environment that radicalizes people into committing mass shootings? Yes.

Really? Then add videogames and heavy metal to the list. And why not most organized religions? Same argument, zero sense. There’s way more at play than Person watches X content = person is now radicalized, unless we’re talking about someone with severe cognitive deficit.

And since this is the US… perhaps add easy access to guns? Nah, that’s totally unrelated.

Chetzemoka@lemmy.world on 20 Mar 2024 10:21 collapse

“Person watches X creative and clearly fictional content” is not analogous in any way to “person watches X video essay crafted to look like a documentary, but actually just full of lies and propaganda”

Don’t be ridiculous

0x0@programming.dev on 20 Mar 2024 10:35 collapse

So it’s the severe cognitive deficit. Ok. Watching anything inherently bad and thinking it’s ok to do so because it seems legit… that’s ridiculous.

Chetzemoka@lemmy.world on 20 Mar 2024 14:56 collapse

I mean, yes. People are stupid. That’s why we have safety regulations. This court case is about a lack of safety regulations.

rambaroo@lemmynsfw.com on 20 Mar 2024 09:24 next collapse

I don’t think you understand the issue. I’m very disappointed to see that this is the top comment. This wasn’t an accident. These social media companies deliberately feed people the most upsetting and extreme material they can. They’re intentionally radicalizing people to make money from engagement.

They’re absolutely responsible for what they’ve done, and it isn’t “by proxy”, it’s extremely direct and deliberate. It’s long past time that courts held them liable. What they’re doing is criminal.

rbesfe@lemmy.ca on 20 Mar 2024 12:07 next collapse

Proving this “intent to radicalize” in court is impossible. What evidence exists to back up your claim beyond a reasonable doubt?

Kalysta@lemmy.world on 20 Mar 2024 19:26 collapse

The algorithms themselves. This decision opens the algorithms up to discovery and now we get to see exactly how various topics are weighted. These companies will sink or swim by their algorithms.

Minotaur@lemm.ee on 20 Mar 2024 12:21 collapse

I do. I just very much understand the extent that the justice system will take decisions like this and utilize them to accuse any person or business (including you!) of a crime that they can then “prove” they were at fault for.

Socsa@sh.itjust.works on 20 Mar 2024 10:31 next collapse

This wasn’t just a content issue. Reddit actively banned people for reporting violent content too much. They literally engaged with and protected these communities, even as people yelled that they were going to get someone hurt.

jumjummy@lemmy.world on 20 Mar 2024 19:35 collapse

And ironically the gun manufacturers or politicians who support lax gun laws are not included in these “nets”. A radicalized individual with a butcher knife can’t possibly do as much damage as one with a gun.

scottmeme@sh.itjust.works on 20 Mar 2024 01:59 next collapse

Excuse me what in the Kentucky fried fuck?

As much as everyone says fuck these big guys all day this hurts everyone.

athos77@kbin.social on 20 Mar 2024 04:04 collapse

I agree with you, but ... I was on reddit since the Digg exodus. It always had its bad side (violentacrez, jailbait, etc), but it got so much worse after GamerGate/Ellen Pao - the misogyny became weaponized. And then the alt-right moved in, deliberately trying to radicalize people, and we worked so. fucking. hard to keep their voices out of our subreddits. And we kept reporting users and other subreddits that were breaking rules, promoting violence and hatred, and all fucking spez would do is shrug and say, "hey it's a free speech issue", which was somewhere between "hey, I agree with those guys" and "nah, I can't be bothered".

So it's not like this was something reddit wasn't aware of (I'm not on Facebook or YouTube). They were warned, repeatedly, vehemently, starting all the way back in 2014, that something was going wrong with their platform and they needed to do something. And they deliberately and repeatedly chose to ignore it, all the way up to the summer of 2021. Seven fucking years of warnings they ignored, from a massive range of users and moderators, including some of the top moderators on the site. And all reddit would do is shrug its shoulders and say, "hey, free speech!" like it was a magic wand, and very occasionally try to defend itself by quoting its 'hate speech policy', which they invoke with the same regular repetitiveness and 'thoughts and prayers' inaction as a school shooting brings. In fact, they did it in this very article:

In a statement to CNN, Reddit said, “Hate and violence have no place on Reddit. Our sitewide policies explicitly prohibit content that promotes hate based on identity or vulnerability, as well as content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people. We are constantly evaluating ways to improve our detection and removal of this content, including through enhanced image-hashing systems, and we will continue to review the communities on our platform to ensure they are upholding our rules.”

As someone who modded for a number of years, that's just bullshit.

Edit: fuck spez.

FunkPhenomenon@lemmy.zip on 20 Mar 2024 04:58 next collapse

hate and violence are bad unless we make money, then its ok.

TimLovesTech@badatbeing.social on 20 Mar 2024 05:20 collapse

The first thing that came to mind when I saw Reddit was The_Donald.

Binthinkin@kbin.social on 20 Mar 2024 14:40 collapse

Yep that’s how the Nazis work on every site. The question is who lets them on these sites so easily to do this work on society. And why do sites fight for them to stay? Are Nazis high up in government? Is it the wealthy? Probably something like that.

BirdEnjoyer@kbin.social on 20 Mar 2024 17:59 collapse

Part of the reason they get so high up on nerd sites (And Reddit at least started as a nerd site) is that they hunger for power, and the right people are too shy to seek power themselves.

This would all be greatly relieved if communities asked their members to nominate other people - the type of folks who would mostly only consider the position if asked, or if they were write-ins.

People with the capacity who get looked over because they maybe lack the ego or self-confidence to take such power.

This works especially well in smaller communities under 4K users or so, which kinda falls apart in our Big Internet world, sadly...

muntedcrocodile@lemmy.world on 20 Mar 2024 02:08 next collapse

What an excellent presedent to set cant possibly see how this is going to become authoritarian. Ohh u didnt report someone ur also guilty cant see any problems with this.

KoboldCoterie@pawb.social on 20 Mar 2024 02:13 next collapse

Ohh u didnt report someone ur also guilty cant see any problems with this.

That’s… not what this is about, though?

“However, plaintiffs contend the defendants’ platforms are more than just message boards,” the court document says. “They allege they are sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with ‘white replacement theory’,” the decision read.

This isn’t about mandated reporting, it’s about funneling impressionable people towards extremist content.

wagesj45@kbin.run on 20 Mar 2024 02:22 next collapse

That means that the government is injecting itself on deciding what "extremist" is. I do not trust them to do that wisely. And even if I did trust them, it is immoral for the state to start categorizing and policing ideologies.

givesomefucks@lemmy.world on 20 Mar 2024 02:25 next collapse

Do you understand you’re arguing for violent groups instigating a race war?

Like, even if you’re ok with white people doing it, you’re also saying ISIS, MS13, any fucking group can’t be labeled violent extremists…

Some “ideologies” need to be fucking policed

bigMouthCommie@kolektiva.social on 20 Mar 2024 02:27 next collapse

anarchists have had to deal with this for over a century. the state can go fuck itself.

wagesj45@kbin.run on 20 Mar 2024 02:27 next collapse

Some "ideologies" need to be fucking policed

Someone wants to start with yours, and they have more support than you know. Be careful what you wish for.

snooggums@midwest.social on 20 Mar 2024 02:42 collapse

Guess we shouldn’t ever do anything about anything, ever.

wagesj45@kbin.run on 20 Mar 2024 02:54 collapse

Big difference between policing actions and policing thoughts. Declaring some thoughts as verboten and subject to punishment or liability is bad.

VirtualOdour@sh.itjust.works on 20 Mar 2024 03:32 collapse

It’s insane you’re being downvoted by people who would be the first ones silenced.

You really think they’re going to use this for homophobes and racists instead of anyone calling for positive social change?

Have you not seen any of history?

muntedcrocodile@lemmy.world on 20 Mar 2024 03:39 collapse

Ur missing the point violence should absolutly be policed. Words ideas ideology hell no let isis, ms13, the communists, the nazis, the vegans etc etc etc say what they want. They are all extremists by some definition let them discuss let them argue and the second someone does something violent lock em for the rest of their lives simple.

What you are suggesting is the policing of ideology to prevent future crime their is an entire book about where that leads to said book simply calls this concept thought crime.

abeorch@lemmy.ml on 20 Mar 2024 02:53 next collapse

That is generally what governments do. They write laws that say… you can do this but not that. If you do this, that’s illegal and you will be convicted. Otherwise you wouldn’t be able to police things like the Mafia and drug cartels. Even in the US, freedom of speech doesn’t protect conspiring to commit crimes. There is no difference between that and politically motivated ‘extremists’ who conspire to commit crimes. The ideology is not criminalised; the acts that groups plan or conduct are. You are totally fine saying: I don’t like X group.

What it’s not ok to say is: let’s go out and kill people from X group.

The problem is that social media sites use automated processes to decide which messages to put in front of users, in fundamentally the same way that a newspaper publisher decides which letters to the editor to put in their newspaper.

Somehow, though, tech companies have argued that because there is no limit on how many posts they can communicate - and hence, theoretically, they aren’t deciding what goes in and what doesn’t - their act of putting some posts at the top of people’s lists so they are seen is somehow different from the act of the newspaper publisher including a particular letter or not… but the outcome is the same: the letter or post is seen by people, or not.

Tech companies argue they are just a communication network, but I never saw a telephone, postal or other network that decided which order you got your phone calls, letters or SMS messages. They just deliver what is sent, in the order it was sent.

Commercial social media networks are publishers with editorial control - and editorial control is not only inclusion/exclusion but also prominence.

There is a fundamental difference with Lemmy or Mastodon, in that those platforms (except for any moderation by individual server admins) don’t promote or demote any post, and therefore don’t have any role in whether a user sees a post or not.

zeluko@kbin.social on 20 Mar 2024 04:41 next collapse

umm.. isn't the government, or rather the judiciary, already deciding what extremist is?
How would this specifically be different?

I can understand the problems this causes for the platforms, but the government injecting decisions is something you focus on?
Not to forget the many other places they inject themselves.. one could say your daily lives, because.. careful now.. you live in a country with a government, whaaat?

520@kbin.social on 20 Mar 2024 08:15 collapse

The government is already the one who makes that decision. The only thing new here is a line being drawn with regards to social media's push towards addiction and echo-chamberism.

Fester@lemm.ee on 20 Mar 2024 02:34 next collapse

And they profit from it. That’s mentioned there too, and it makes it that much more infuriating. They know exactly what they’re doing, and they do it on purpose, for money.

And at the end of the day, they’ll settle (who are the plaintiffs? Article doesn’t say) or pay some relatively inconsequential amount, and they’ll still have gained a net benefit from it. Another case of cost-of-doing-business.

Would’ve been free without the lawsuit even. Lives lost certainly aren’t factored in otherwise.

Kraiden@kbin.run on 20 Mar 2024 02:36 next collapse

Youtube Shorts is the absolute worst for this. Just recently it's massively trying to push transphobic BS at me, and I cannot figure out why. I dislike, report and "do not recommend this channel" every time, and it just keeps shoving more at me. I got a fucking racist church sermon this morning. it's broken!

shalafi@lemmy.world on 20 Mar 2024 03:00 next collapse

I am not discounting anyone’s experience. I am not saying this isn’t happening. But I don’t see it.

LiberalGunNut™ here! You would think watching gun related videos would lead me down a far-right rabbit hole. Here’s my feed ATM.

Meh. History, gun comparisons, chemistry, movies, whatever. Nothing crazy. (Don’t watch Brandon any longer, he got too right-leaning, too political. Video’s about his bid for a Congressional seat in Texas. Not an election conspiracy thing. Don’t care.)

If anyone can help me understand, I’m listening. Maybe I shy away from the nutcase shit so hard that YouTube “gets” me? Honestly don’t get it.

Kraiden@kbin.run on 20 Mar 2024 03:04 collapse

So that looks like main long form content. I'm specifically talking about youtube shorts which is Google's version of TikTok

VirtualOdour@sh.itjust.works on 20 Mar 2024 03:26 next collapse

Don’t dislike it, just hit “do not recommend”, and don’t open the comments - honestly the best way is to skip past as fast as you can when you see one; the less time it’s on your screen, the less the algo thinks you want it.

I never really see that on YouTube unless I’ve been on related topics recently, and it goes away pretty quick when you don’t interact. Yes it’s shifty, but they’re working on a much better system using natural language with an LLM - it’s a complex problem.

muntedcrocodile@lemmy.world on 20 Mar 2024 03:32 collapse

Imagine watchibg let alone even having the option for shorts. Get newpipe there is a sponsorblock version on fdroid no shorts no google tracking no nonsence u dont get comments tho but whatever. It also supports peertube which is nice.

Report for what? Sure disagree with them about their bullshit but i dont see why u need to report someone just cos u disagree with their opinions.

Kraiden@kbin.run on 20 Mar 2024 04:44 collapse

Imagine watchibg let alone even having the option for shorts.

I like shorts for the most part

Report for what?

Misinformation and hatespeech mostly. They have some crazy, false pseudoscience to back their "opinions" and they express them violently. Like it or not, these videos "promote hatred against a protected group" and are expressly against youtube TOS. Reporting them is 100% appropriate.

muntedcrocodile@lemmy.world on 20 Mar 2024 15:22 collapse

I can strongly reccommwnd stop watching ahort form content it has been proven to caise all sorts of mental issues.

Fair. Also what is a “protected group” what makes it any different from any other grouping?

muntedcrocodile@lemmy.world on 20 Mar 2024 03:54 collapse

U can make any common practice and pillar of capitalism sound bad by using the words impressionable and extremist.

If we remove that it become: funnelling a market towards the further consumption of your product. I.e. marketing

And yes of cause the platforms are designed to be addictive and are effective at indoctranation but why is that only a problem for certain ideologies shouldnt we be stopping all ideologies from practicing indoctranation of impressionable people should we not be guiding people to as many viewpoints as possible to teach them to think not to swallow someone elses ideas and spew them back out.

I blame Henry Ford for this whole clusterfuck he lobbied the education system to manufacture an obedient consumer market and working class that doesnt think for itself but simply swallows what its told. The education system is the problem anything else is treating the symptoms not the disease.

KoboldCoterie@pawb.social on 20 Mar 2024 04:11 collapse

If we remove that it become: funnelling a market towards the further consumption of your product. I.e. marketing

And if a company’s marketing campaign is found to be indirectly responsible for a kid shooting up a grocery store, I’m sure we’ll be seeing a repeat of this, with that company being the one with a court case brought against them. What even is this argument?

muntedcrocodile@lemmy.world on 20 Mar 2024 15:19 collapse

Isnt the entire gun market indirectly responsible, what about the food the shooters ate? Cant we use the same logic to prssecute anyone of any religion cos most of the religiouse texts support the killing of some group of people.

Its convenient to ask what the argument is when u ignore 60% of it

KoboldCoterie@pawb.social on 20 Mar 2024 15:27 collapse

Did you even read the article we’re discussing, or are you just reading the comments and getting mad?

  1. No decision has been made. This is simply a judge denying the companies’ motion to have this thrown out before going to trial.
  2. This is very much different than “the gun market” being indirectly responsible. This is the equivalent of “the gun market” constantly sending a person pamphlets, calling them, emailing them, whatever else, with propaganda until they ultimately decided to act on it. If that was happening, I think we’d be having the same conversation about that, and whether they should be held accountable.
  3. Whether they’re actually responsible or not (or whether any group is) can be determined in court following all the usual methods. A company getting to say “That’s ridiculous, we’re above scrutiny” is dangerous, and that’s effectively what they were trying to do (which was denied by this judge.)
Drusas@kbin.run on 20 Mar 2024 02:15 next collapse

You could make a good point with better spelling, grammar, and word choice.

muntedcrocodile@lemmy.world on 20 Mar 2024 03:41 collapse

Yes i could. I could spend the extra 30seconds fixing it or i could not bother and still have my point comprehendable.

FunkPhenomenon@lemmy.zip on 20 Mar 2024 05:00 collapse

unpossible

[deleted] on 20 Mar 2024 02:32 collapse

.

hal_5700X@sh.itjust.works on 20 Mar 2024 02:18 next collapse

Here comes more censorship from Big Tech. 🤦‍♂️

RainfallSonata@lemmy.world on 20 Mar 2024 03:29 next collapse

Platforms should be held responsible for the content their users publish on them, full stop.

Lath@kbin.earth on 20 Mar 2024 06:44 next collapse

So if some random hacker takes over your network connection and publishes illegal content which then leads back to you, you should be held responsible. It's your platform after all.

pendingdeletion@lemmy.world on 20 Mar 2024 13:40 collapse

If it’s your server, then yes you should have responsibility with how you deal with said content.

Lath@kbin.earth on 20 Mar 2024 15:12 collapse

How much of a responsibility? Is a token effort enough or should you be charged with a crime for not trying hard enough?

0x0@programming.dev on 20 Mar 2024 10:16 next collapse

Content creators should be held responsible for their content. Platforms are mere distributors, in general terms, otherwise you’re blaming the messenger.

Specific to social media (and television) yes, they bank on hate, it’s known - so don’t use them or do so with that ever dwindling human quality called critical thinking. Wanting to hold them accountable for murder is just dismissing the real underlying issues, like unsupervised impressionable people watching content, easy access to guns, human nature itself, societal issues…

conciselyverbose@sh.itjust.works on 20 Mar 2024 22:05 collapse

Then user generated content completely disappears.

Without the basic protection of section 230, it’s not possible to allow users to exist or interact with anything. I’m not sure you could even pay for web hosting without it.

atrielienz@lemmy.world on 20 Mar 2024 03:31 next collapse

So, I can see a lot of problems with this. Specifically the same problems that the public and regulating bodies face when deciding to keep or overturn section 230. Free speech isn’t necessarily what I’m worried about here. Mostly because it is already agreed that free speech is a construct that only the government is actually beholden to. Message boards have and will continue to censor content as they see fit.

Section 230 basically stipulates that companies that provide online forums (Meta, Alphabet, 4Chan etc) are not liable for the content that their users post. And part of the reason it works is because these companies adhere to strict guidelines in regards to content and most importantly moderation.

Section 230(c)(2) further provides “Good Samaritan” protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Reddit, Facebook, 4Chan et al. do have rules and regulations they require their users to follow in order to post. And for the most part the communities on these platforms are self-policing. There just aren’t enough paid moderators to make it work otherwise.

That being said, the real problem is that this really kind of indirectly challenges section 230. Mostly because it very barely skirts around whether the relevant platforms can themselves be considered publishers, or at all responsible for the content the users post and very much attacks how users are presented with content to keep them engaged via algorithms (which is directly how they make their money).

Even if the lawsuits fail, this will still be problematic. It could lead to draconian moderation of what can be posted and by whom. So now all race related topics regardless of whether they include hate speech could be censored for example. Politics? Censored. The discussion of potential new laws? Censored.

But I think it will be worse than that. The algorithm is what makes the ad space these companies sell so valuable. And this is a direct attack on that. We lack the consumer privacy protections to protect the public from this eventuality. If the ad space isn’t valuable the data will be. And there’s nothing stopping these companies from selling user data. Some of them already do. What these apps do in the background is already pretty invasive. This could lead to a furthering of that invasive scraping of data. I don’t like that.

That being said there is a point I agree with. These companies literally do make their algorithm addictive and it absolutely will push content at users. If that content is of an objectionable nature, so long as it isn’t outright illegal, these companies do not care. Because they do gain from it monetarily.

What we actually need is data privacy protections. Holding these companies accountable for their algorithms is a good idea. But I don’t agree that this is the way to do that constructively. It would be better to flesh out 230 as a living document that can change with the times. Because when it was written the Internet landscape was just different.

What I would like to see is for platforms to moderate content posted and representing itself as fact. We don’t see that nearly enough on places like reddit. Users can post anything as fact and the echo chambers will rally around it if they believe it. It’s not really incredibly difficult to radicalise a person. But the platforms aren’t doing that on purpose. The other users are, and the algorithms are helping them.

yamanii@lemmy.world on 20 Mar 2024 04:53 collapse

Moderation is already draconian; interact with any gen Z kid and you’re gonna learn what goon, corn, unalive, and (crime) “in Minecraft” actually mean.

These aren’t slang terms, this is like a second language developed to evade censorship on those platforms. Things will only get worse.

atrielienz@lemmy.world on 20 Mar 2024 12:24 collapse

It’s always been that way though. Back in the day on Myspace or MSN chatrooms there were whole lists of words that were auto censored and could result in a ban (temp or permanent). We literally had whole lists of alternates to use. You couldn’t say sex, or kill back then either. The difference is the algorithm. I acknowledge in my comment that these platforms already censor things they find objectionable. Part of that is to keep Section 230 as it is. A perhaps more relevant part of it is to keep advertisers happy so they continue to buy ad space. A small portion of it may even be to keep the majority of the user base happy because users who don’t agree with the supposed ideologies on a platform will leave it and that’s less eyeballs on ads.
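
A sketch of why those fixed word lists breed replacement vocabulary (a two-word list invented for illustration): exact-match filters miss both inflections and slang, so users route around them.

```python
import re

BANNED = {"suicide", "kill"}  # invented stand-in for a platform's list
pattern = re.compile(r"\b(" + "|".join(BANNED) + r")\b", re.IGNORECASE)

def censor(text: str) -> str:
    return pattern.sub("****", text)

print(censor("kill count"))        # "**** count" -- caught
print(censor("he was killed"))     # unchanged: the inflection slips through
print(censor("unalive yourself"))  # unchanged: slang evades the list entirely
```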

Mastengwe@lemm.ee on 20 Mar 2024 03:57 next collapse

It’s ALWAYS someone else’s fault.

FunkPhenomenon@lemmy.zip on 20 Mar 2024 04:55 next collapse

eh… anyone can be “radicalized” by anything. is anyone suing Mecca when islamic fundamentalists jihad someone/something? is anyone suing the Catholic church because of christian fundamentalists doing the same thing?

holding tech companies liable because some crazy dumbshit did a bad thing is disingenuous at best. The judge’s ruling isn’t going to stand.

Not_mikey@slrpnk.net on 20 Mar 2024 06:29 next collapse

Judge hasn’t ruled yet, this was just them saying the case has some merit and won’t be dismissed. This will go to trial, after which the judge will make their ruling.

AtmaJnana@lemmy.world on 20 Mar 2024 12:06 collapse

Reading comprehension is important. As is a highschool level understanding of how our legal system works.

Nomad@infosec.pub on 20 Mar 2024 06:19 next collapse

Nice, now do all religions and churches next

Not_mikey@slrpnk.net on 20 Mar 2024 07:01 next collapse

Sweet, I’m sure this won’t be used by AIPAC to sue all the tech companies for causing October 7th somehow, like UNRWA, and force them to shut down or suppress all talk on Palestine. People hearing about a genocide happening might radicalize them; maybe we could get away with allowing discussion, but better safe than sorry - to the banned words list it goes.

This isn’t going to end in the tech companies hiring a team of skilled moderators who understand the nuance between passion and radical intention to preserve a safe space for political discussion; that costs money. This is going to end up with a dictionary of banned and suppressed words.

glovecraft@infosec.pub on 20 Mar 2024 07:19 next collapse

This is going to end up with a dictionary of banned and suppressed words

Do you have some examples?

Alpha71@lemmy.world on 20 Mar 2024 08:40 collapse

It’s already out there. For example you can’t use the words “suicide”, “rape”, or “murder” on YouTube, TikTok, etc., even when the discussion is clearly about trying to educate people. Heck, you can’t even mention Onlyfans on Twitch…

Makhno@lemmy.world on 20 Mar 2024 09:42 collapse

Heck, you can’t even mention Onlyfans on Twitch…

They don’t like users mentioning their direct competition

0x0@programming.dev on 20 Mar 2024 09:59 collapse

Plus more demands for backdoors in encryption.

Simulation6@sopuli.xyz on 20 Mar 2024 09:12 next collapse

Add Fox news and Trump rallies to the list.

0x0@programming.dev on 20 Mar 2024 09:58 collapse

Don’t forget Marilyn Manson and videogames.

/s

cophater69@lemm.ee on 20 Mar 2024 11:51 next collapse

Marilyn Manson led a charge to overthrow the government??

Passerby6497@lemmy.world on 20 Mar 2024 12:15 collapse

Doubt it. Last time I saw him on stage, he made trump look like an eloquent speaker.

[deleted] on 20 Mar 2024 12:17 collapse

.

Passerby6497@lemmy.world on 20 Mar 2024 12:23 collapse

Because I don’t like that an artist I once enjoyed is a drugged-out and drunken mess? Based on the reaction it definitely sounds like it.

Didn’t think that many lemmings liked washed-up has-been metal acts, but to each their own I guess.

cophater69@lemm.ee on 20 Mar 2024 13:34 collapse

I actually responded to the wrong person and I apologize. I’ve actually heard the same thing about MM lately – just washed-up and sad.

TheDarksteel94@sopuli.xyz on 20 Mar 2024 12:03 next collapse

Idk why you’re getting downvoted for an obvious joke lol

cophater69@lemm.ee on 20 Mar 2024 12:09 collapse

Because it’s not funny or relevant and is an attempt to join two things - satanic panic with legal culpability in social media platforms.

TheBat@lemmy.world on 20 Mar 2024 12:19 collapse

Not relevant?

Metal music and videos games have been blamed for mass shootings before.

allcopsarebad@lemm.ee on 20 Mar 2024 12:35 collapse

And this is neither of those things. This is something much more tangible, with actual science behind it.

TheBat@lemmy.world on 20 Mar 2024 13:45 collapse

Yes, that’s exactly the point.

People who supposedly care about children’s safety are willing to ignore science and instead raise a hue and cry about bullshit they perceive (or are told by their favourite TV personality) is evil.

Have you got it now? Or should I explain it further?

Didn’t expect Lemmy to have people who lack reading comprehension.

isles@lemmy.world on 20 Mar 2024 15:49 collapse

People don’t appreciate having spurious claims attached to their legitimate claims, even in jest. It invokes the idea that since the previous targets of blame were false, these likely are as well.

0x0@programming.dev on 21 Mar 2024 09:34 collapse

They’re all external factors. Music and videogames have been (wrongly, imo) blamed in the past. Media, especially nowadays, is probably more “blameable” than music and games, but I still think it’s bs to use external factors as an excuse to justify mass shootings.

isles@lemmy.world on 21 Mar 2024 13:58 collapse

What are the internal factors of a person that are not influenced by the environment or culture?

0x0@programming.dev on 20 Mar 2024 10:00 next collapse

It’s never the parents, is it?

geogle@lemmy.world on 20 Mar 2024 10:28 next collapse

Ask those parents in the Michigan case

ButtCheekOnAStick@lemmy.world on 20 Mar 2024 11:22 collapse

Ask the parents of the Menendez brothers, oh wait.

PoliticalAgitator@lemmy.world on 20 Mar 2024 12:58 next collapse

You mean the “responsible gun owners” who don’t properly secure their weapons from a child?

echodot@feddit.uk on 20 Mar 2024 13:47 collapse

I couldn’t work this out from the article: is it the parents raising this suit, or the victims’ families?

Socsa@sh.itjust.works on 20 Mar 2024 10:26 next collapse

Please let me know if you want me to testify that reddit actively protected white supremacist communities and even banned users who engaged in direct activism against these communities

FenrirIII@lemmy.world on 20 Mar 2024 11:47 next collapse

I was banned for activism against genocide. Reddit is a shithole.

fine_sandy_bottom@discuss.tchncs.de on 20 Mar 2024 12:33 next collapse

Well yeah it is but… what did you think would happen?

misspacific@lemmy.blahaj.zone on 20 Mar 2024 13:46 collapse

i was banned for similar reasons.

seems like a lot of mods just have the ability to say whatever about whoever and the admins just nuke any account they target.

Ragnarok314159@sopuli.xyz on 20 Mar 2024 18:41 collapse

I have noticed a massive drop in the quality of posting on Reddit over the last year. It was already in decline, but then there was a massive drop-off.

It matches the anecdotes I’ve read on Lemmy: a lot of high-karma accounts have been nuked due to mods and admins being ridiculously overzealous in handing out permabans.

jkrtn@lemmy.ml on 20 Mar 2024 14:03 collapse

Send your evidence to the lawyers, couldn’t hurt.

[deleted] on 20 Mar 2024 11:10 next collapse

.

RatBin@lemmy.world on 20 Mar 2024 11:19 next collapse

Completely different cases, questionable comparison:

  • social media are the biggest cultural industry at the moment, albeit a silent and unnoticed one. Cultural industries like this are means of propaganda, information and socialization, all of which is impactful and heavily personalised to each person’s opinions.

  • thus the role of such an impactful business is huge and can move opinions and whole movements; the choices people make are driven by their media consumption and the communities they take part in.

  • In other words, policy, algorithms, and GUI are all factors that drive users to engage in specific ways with harmful content.

RealFknNito@lemmy.world on 20 Mar 2024 11:28 collapse

biggest cultural industry at the moment

I wish you guys would stop making me defend corporations. Doesn’t matter how big they are, doesn’t matter their influence, claiming that they are responsible for someone breaking the law because someone else wrote something that set them off and they, as overlords, didn’t swoop in to stop it is batshit.

Since you don’t like those comparisons, I’ll do one better. This is akin to a man shoving someone over a railing and trying to hold the landowners responsible for not having built a taller railing or more gradual drop.

You completely fucking ignore the fact someone used what would otherwise be a completely safe platform because another party found a way to make it harmful.

policy and algorithms are factors that drive users to engage

Yes. Engage. Not with harmful content specifically; that content just happens to be what humans react to most strongly. If talking about fields of flowers drove more engagement, we’d never stop seeing posts about flowers. It’s not them maliciously pushing it; it’s the collective society that’s fucked.

The solution is exactly what it has always been. Stop fucking using the sites if they make you feel bad.

RatBin@lemmy.world on 20 Mar 2024 11:42 collapse

Again, there is no such thing as a neutral space or platform. Case in point: reddit, with its gated communities and lack of control over what people do with the platform, is in fact creating safe spaces for these kinds of things. This may not be intentional, but it ultimately leads towards the radicalization of many people. It’s a design choice, backed by the internal policy of the admins, who can decide to let these communities stay on one of the mainstream websites. If you’re unsure about what to think, delving deep into these subreddits has the effect of radicalising you, whereas in a normal space you wouldn’t be able to do it as easily. Since this counts as engagement, reddit can suggest similar forums, leading via algorithms to a path of radicalisation. This is why a site that claims to be neutral isn’t truly neutral.

This is an example of the alt-right pipeline that reddit successfully mastered:

The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups (en.wikipedia.org/wiki/Alt-right_pipeline)

And yet you keep comparing cultural and media consumption to physical infrastructure, which is regulated precisely to prevent what you mentioned, unsafe management of the terrain for instance. So taking your examples as you intended them, you may just prove that regulations can in fact exist and that private companies and citizens are supposed to follow them. Since social media started using personalisation and predictive algorithms, they also behave as editors, handling and selecting the content that users see. Why would they not be partly responsible, based on your own argument?

RealFknNito@lemmy.world on 20 Mar 2024 11:54 collapse

No such thing as neutral space

it may not be intentional, but

They can suggest similar [communities] so it can’t be neutral

My guy, what? If all you did was look at cat pictures, you’d get suggested communities that share fucking cat pictures. These sites aren’t any more to blame for “radicalizing” people into sharing cat pictures than they are for the actually harmful communities. By your logic, lemmy can also radicalize people. I see anarchist bullshit all the time; I had to block those communities and curate my own experience. I took responsibility, and instead of engaging with every post that pissed me off, I removed that content or avoided it. Should the instance I’m on be responsible for not defederating radical instances? Should these communities be made to pay for radicalizing others?

Fuck no. People are not victims because of the content they’re exposed to; they choose to let themselves become radical. This isn’t an “I woke up and I really think Hitler had a point” situation, it’s a gradual decline that isn’t going to be fixed by censoring or obscuring extreme content. Companies already try to deal with the flagrant forms of it, but holding them to account for all of it is truly and completely stupid.

Nobody should be responsible because cat pictures radicalized you into becoming a furry. That’s on you. The content changed you and the platform suggesting that content is not malicious nor should it be held to account for that.

cophater69@lemm.ee on 20 Mar 2024 12:03 next collapse

This is an extremely childish way of looking at the world, IT infrastructure, social media content algorithms, and legal culpability.

RealFknNito@lemmy.world on 20 Mar 2024 12:06 collapse

As neutral platforms that will push cat pictures as readily as far-right extremism, where the only difference is how much the user personally engages with it?

Whatever you say, CopHater69. You’re definitely not extremely childish and radical.

cophater69@lemm.ee on 20 Mar 2024 12:12 collapse

Oh I’m most certainly a radical, but I understand what that means because I got a college degree, and now engineer the internet.

RealFknNito@lemmy.world on 20 Mar 2024 12:15 collapse

I doubt you could engineer a plug into your own asshole, but sure, I’ll take your word that you’re not just lying and have expert knowledge in this field, yet you still refused to engage with the point and slung insults instead.

cophater69@lemm.ee on 20 Mar 2024 12:17 collapse

So triggered

RealFknNito@lemmy.world on 20 Mar 2024 13:16 collapse

Always something about radicals and their need to point out “Ur triggered”

herpaderp@lemmynsfw.com on 20 Mar 2024 12:40 collapse

I’ve literally watched friends of mine descend into far-right thinking, and I can point to the moment when algorithms started suggesting content that put them down a “rabbit hole”.

Like, you’re not wrong that they were right-wing initially, but they became the “lmao I’m an unironic fascist and you should be pilled like me” variety over a period of six months or so. Started stockpiling guns, etc.

This phenomenon is so commonly reported that it makes you wonder where all these people who supposedly “radicalized themselves” all at once came from in such droves.

Additionally, these companies are responsible for their content-serving algorithms, and if those algorithms didn’t matter for shaping users’ thoughts, why would nation-state propaganda efforts work so hard to get their narratives and interests appearing within them? Did we forget the spawning and ensuing fallout of the Arab Spring?

Bonesince1997@lemmy.world on 20 Mar 2024 11:51 next collapse

The examples you came up with hit that last line to a T!

[deleted] on 20 Mar 2024 11:58 next collapse

.

[deleted] on 20 Mar 2024 12:01 collapse

.

[deleted] on 20 Mar 2024 12:04 collapse

.

[deleted] on 20 Mar 2024 12:06 collapse

.

[deleted] on 20 Mar 2024 12:10 collapse

.

[deleted] on 20 Mar 2024 12:13 collapse

.

[deleted] on 20 Mar 2024 12:41 collapse

.

Passerby6497@lemmy.world on 20 Mar 2024 12:18 collapse

Yeah, good thing we don’t have evidence of any social media company’s algorithms radicalizing and promoting more and more extreme content to people.

Could you imagine? Companies actively radicalizing people for money??

RealFknNito@lemmy.world on 20 Mar 2024 13:16 collapse

Fuck, it’s almost like they promote things that have high engagement, and rage and fear happen to be our most reactive emotions.

Could you imagine? A coincidence without a malicious conspiracy??

[deleted] on 20 Mar 2024 13:17 collapse

.

Zuberi@lemmy.dbzer0.com on 20 Mar 2024 13:14 next collapse

Fuck Reddit, can’t wait to see the IPO burn

Embarrassingskidmark@lemmy.world on 20 Mar 2024 13:38 next collapse

The trifecta of evil. Especially Reddit, fuck Reddit… Facebook too.

echodot@feddit.uk on 20 Mar 2024 13:44 collapse

Facebook will have actively pushed this stuff. Reddit will have just ignored it, and YouTube just feeds your own bubble back to you.

YouTube doesn’t radicalize people so much as deepen their existing radicalization; the process must start elsewhere. And to be completely fair, they do put warnings and links to further information at the bottom of questionable videos, and they also delist quite a lot of stuff.

I don’t know what’s better: to completely block conspiracy theory videos, or to allow them and have other people mock them.

jkrtn@lemmy.ml on 20 Mar 2024 14:02 next collapse

Hard disagree that YouTube doesn’t radicalize people. It’s far too easy to have Ben Shapiro show up in the recommendations.

echodot@feddit.uk on 20 Mar 2024 15:03 collapse

Well I don’t know who that is, which is my point really. I’m assuming he’s some right-wing conspiracy theorist, but because I’m not already predisposed to listen to that kind of stuff, I don’t get it in my recommendations.

Meanwhile Facebook would actively promote that stuff.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:44 collapse

Well I don’t know who that is,

Consider yourself lucky.

echodot@feddit.uk on 20 Mar 2024 22:45 collapse

Yeah, I feel like people are missing my point: I don’t know who he is, and I don’t get recommended his content.

The only people who get recommended his content are people who are already thinking along those lines and watching videos along those lines.

YouTube doesn’t radicalize people; they do it to themselves.

afraid_of_zombies@lemmy.world on 20 Mar 2024 23:04 collapse

Right except people are telling you, repeatedly, that this isn’t true.

ultranaut@lemmy.world on 20 Mar 2024 16:40 next collapse

Why do you believe “the process must start elsewhere”? I’ve literally had YouTube start feeding me this sort of content, which I have no interest in at all and actively try to avoid. It seems very obvious that YouTube is a major factor in inculcating these belief systems in people who would otherwise not be exposed to them without YouTube ensuring they reach an audience.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:43 collapse

YouTube would hit me hard with religious messaging and rightwing stuff. Which is not at all reflective of what content I want to view.

antidote101@lemmy.world on 20 Mar 2024 13:48 next collapse

Can we stop letting the actions of a few bad people be used to curtail our freedom on platforms we all use?

I don’t want the internet to end up being policed by corporate AIs and poorly implemented bots (looking at you auto-mod).

The internet is already a husk of what it used to be, of what it could be. It used to be personal, customisable… dare I say it: messy and human…

… maybe that was serving a need that now people feel alienated from. Now we live as corporate avatars who risk being banned every time we comment anywhere.

It’s tiresome.

tocopherol@lemmy.dbzer0.com on 20 Mar 2024 13:51 next collapse

Facebook and others actively promote harmful content because they know it drives interactions. I believe it’s possible to punish corps without making the internet overly policed.

tbs9000@lemmy.world on 20 Mar 2024 18:22 collapse

I agree with you in spirit. The most common sentiment I see among the comments is not about limiting what people can share, but about how actively platforms move people down rabbit holes. If platforms take no action to correct for this, they risk regulation, which in turn puts freedom of speech at risk.

Phanatik@kbin.social on 20 Mar 2024 13:53 next collapse

I don't understand the comments suggesting this is "guilty by proxy". These platforms have algorithms designed to keep you engaged and, through their callousness, have allowed extremist content to remain visible.

Are we going to ignore all the anti-vaxxer groups who fueled vaccine hesitancy which resulted in long dead diseases making a resurgence?

To call Facebook anything less than complicit in the rise of extremist ideologies and conspiratorial beliefs, is extremely short-sighted.

"But Freedom of Speech!"

If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don't deserve to have that speech. Sorry, you've violated the social contract and those people's blood is on your hands.

SuperSaiyanSwag@lemmy.zip on 20 Mar 2024 16:22 next collapse

This may seem baseless, but I have seen it over years of experience in online forums. You don’t have to take it seriously, but maybe you can relate. We have seen time and time again that if there is no moderation, the shit floats to the top. The reason is that when people can’t post something creative or fun but still want the attention, they will post negativity. It’s the loud minority, but it’s a very dedicated loud minority. Let’s say we have 5 people and 4 of them are very creative and funny, but 1 of them complains all the time. If they all post to the same community, there is a very good chance that the one negative person will make a lot more posts than the 4 creative types.

Kyatto@leminal.space on 21 Mar 2024 02:37 collapse

Oh absolutely, and making something creative takes days, weeks, months.

Drama, complaining, conspiracy theorizing, and hate videos take only a few minutes more to make than the video itself lasts.

firadin@lemmy.world on 20 Mar 2024 16:32 next collapse

Not just “remain visible” - actively promoted. There’s a reason people talk about YouTube’s right-wing content pipeline. If you start watching anything male-oriented, YouTube will slowly promote more and more right-wing content to you until you’re watching Ben Shapiro and Andrew Tate.

BeMoreCareful@lemmy.world on 20 Mar 2024 17:12 next collapse

YouTube is really bad about trying to show you right-wing crap. It’s overwhelming. The Shorts are even worse: every few minutes there’s some new suggestion for stuff that is way out of the norm.

TikTok doesn’t have this problem, and yet it’s the one being attacked by politicians?

reverendsteveii@lemm.ee on 20 Mar 2024 17:24 next collapse

it legit took youtube’s autoplay about half an hour after I searched “counting macros” to bring me to american monarchist content

Wogi@lemmy.world on 20 Mar 2024 17:46 collapse

Oh are we doing kings now?

Vote for me for King, I’ll make sure there’s a taco truck on every corner.

captainlezbian@lemmy.world on 20 Mar 2024 21:00 collapse

I’ll vote for you if you make me the Marquess of Michigan

Wogi@lemmy.world on 21 Mar 2024 01:53 collapse

… Deal. But I reserve the right to turn the upper peninsula in to my own personal resort.

captainlezbian@lemmy.world on 21 Mar 2024 02:30 collapse

Deal, but Michigan gets Cleveland then. I feel like Cleveland for the UP is a fair trade

Wogi@lemmy.world on 21 Mar 2024 15:49 collapse

That’s actually a great idea. I’ve always been a fan of the Browns.

Ragnarok314159@sopuli.xyz on 20 Mar 2024 18:35 collapse

I got into painting mini Warhammer 40k figurines during covid, and thought the lore was pretty interesting.

Every time I watch a video, my suggested feed goes from videos related to my hobbies to being entirely replaced with red-pill garbage. The right-wing channels must be highly profitable for YouTube to funnel people into: just an endless tornado of rage and constant viewing.

Gullible@sh.itjust.works on 20 Mar 2024 20:10 next collapse

The algorithm is, after all, optimized for nothing other than advertisements served per time period. So long as the algorithm believes that a video suggestion will keep you on the website for a minute more, it will suggest it. I occasionally wonder about the implications of one topic leading to another. Is WH40k suggested as a pipeline entrance by demographics alone, or something more?

Irritation at suggestions was actually what originally led me to Invidious. I just wanted to watch a speech without hitting the “____ GETS DUNKED ON LIKE A TINY LITTLE BITCH” zone. Fuck me for trying to verify information.
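As a toy illustration of what “optimized for nothing other than time on site” means in practice (my own simplification with made-up names, not YouTube’s actual ranker), note that nothing in the score below knows or cares what a video is about:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_minutes: float  # output of some engagement-prediction model

def suggest(candidates: list[Candidate], k: int = 10) -> list[Candidate]:
    # The only ranking signal is expected time-on-site; topic, accuracy,
    # and harm never enter the objective.
    ranked = sorted(candidates, key=lambda c: c.predicted_watch_minutes, reverse=True)
    return ranked[:k]
```

Under an objective like this, a rage-bait video and a hobby video are interchangeable; whichever one the model predicts will hold you longer wins the slot.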

r3df0x@7.62x54r.ru on 21 Mar 2024 16:32 collapse

One thing to consider is that conservatives are likely paying for progressives to see their content, and geeks tend to have liberal views and follow the harm principle without many conditions.

Otherwise, it really shows the demographics of the people who play Warhammer. Before my sister transitioned, she played Warhammer and was a socialist, but had a lot of really wehraboo interests. She has been talking about getting back into it, but she passes really well and wonders how it would go with the neckbeards.

driving_crooner@lemmy.eco.br on 20 Mar 2024 17:38 next collapse

What about YouTube? They actually paid those people to spread their sick ideas, making the world a worse place and getting rich while doing it.

Phanatik@kbin.social on 22 Mar 2024 00:49 collapse

YouTube does actually take action and has done so in most instances. I won't say they're the fastest, but they do kick people off the platform if they deem them high-risk.

cows_are_underrated@feddit.de on 20 Mar 2024 19:00 next collapse

“But freedom of speech”

If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don’t deserve to have that speech.

In Germany we have a very good rule for this (it’s not written down, but it’s something you can usually count on): your freedom ends where it violates the freedom of others. An example: everyone has the right to live a healthy life, and everyone has the right to walk wherever they want. If I now use my right to walk wherever I want and cause a car accident in which people get hurt (and it was only my fault), my freedom has violated the injured person’s right to live a healthy life. That’s not freedom.

RaoulDook@lemmy.world on 20 Mar 2024 19:58 next collapse

Very reasonable, close to the “Golden Rule” concept of being excellent to each other

Syringe@lemmy.world on 20 Mar 2024 20:34 collapse

In Canada, they have an idea called “right to peace”. It means that you can’t stand outside of an abortion clinic and scream at people because your right to free speech doesn’t exceed that person’s right to peace.

I don’t know if that’s 100% how it works so someone can sort me out, but I kind of liked that idea

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:31 collapse

Ok…don’t complain to me later when the thing you like gets taken down.

roguetrick@lemmy.world on 20 Mar 2024 13:57 next collapse

They’re appealing the denial of the motion to dismiss, huh? I agree that this case really doesn’t have legs, but I didn’t know that was an interlocutory appeal they could take. They’d win at summary judgment regardless.

Jaysyn@kbin.social on 20 Mar 2024 14:04 next collapse

Good.

There should be no quarter for fascists, violent racists, or their enablers.

Conspiracy for cash isn't a free speech issue.

Morefan@retrolemmy.com on 20 Mar 2024 15:42 collapse

for fascists, violent racist or their enablers.

Take a good long look in the mirror (and a dictionary printed before 2005) before you say things like this.

PiratePanPan@lemmy.dbzer0.com on 20 Mar 2024 18:24 next collapse

<img alt="" src="https://lemmy.dbzer0.com/pictrs/image/3f9f2f34-e832-4c2b-9777-773d042138b6.jpeg">

Jaysyn@kbin.social on 20 Mar 2024 18:46 collapse

Fuck off, symp.

Morefan@retrolemmy.com on 20 Mar 2024 21:55 collapse

Glow harder.

Binthinkin@kbin.social on 20 Mar 2024 14:36 next collapse

Goddamn right they do. Meta should be sued to death for the genocides too.

yarr@feddit.nl on 20 Mar 2024 14:42 next collapse

Are the platforms guilty or are the users that supplied the radicalized content guilty? Last I checked, most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves.

KneeTitts@lemmy.world on 20 Mar 2024 14:56 next collapse

most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves

It’s their job to block that content before it reaches an audience, but since that’s how they make their money, they don’t or won’t do that. The monetization of evil is the problem, and those platforms are the biggest perpetrators.

yarr@feddit.nl on 20 Mar 2024 15:00 collapse

Its their job to block that content before it reaches an audience

The problem is (or isn’t, depending on your perspective) that it is NOT their job. Facebook, YouTube, and Reddit are private companies that have the right to develop and enforce their own community guidelines or terms of service, which dictate what type of content can be posted on their platforms. This includes blocking or removing content they deem harmful, objectionable, or radicalizing. While these platforms are protected under Section 230 of the Communications Decency Act (CDA), which provides immunity from liability for user-generated content, this protection does not extend to knowingly facilitating or encouraging illegal activities.

There isn’t specific U.S. legislation requiring social media platforms like Facebook, YouTube, and Reddit to block radicalizing content. However, many countries, including the United Kingdom and Australia, have enacted laws that hold platforms accountable if they fail to remove extremist content. In the United States, there have been proposals to amend or repeal Section 230 of CDA to make tech companies more responsible for moderating the content on their sites.

ITGuyLevi@programming.dev on 20 Mar 2024 16:21 next collapse

The argument could be made (and probably will be) that they promote those activities by allowing their algorithms to promote that content. It’s a dangerous precedent to set, but not an unlikely one given the recent rulings.

FlyingSpaceCow@lemmy.world on 20 Mar 2024 17:15 next collapse

Any precedent here regardless of outcome will have significant (and dangerous) impact, as the status quo is already causing significant harm.

For example Meta/Facebook used to prioritize content that generates an angry face emoji (over that of a “like”) - - as it results in more engagement and revenue.

However the problem still exists. If you combat problematic content with a reply of your own (because you want to push back against hatred, misinformation, or disinformation) then they have even more incentiive to show similar content. And they justify it by saying “if you engaged with content, then you’ve clearly indicated that you WANT to engage with content like that”.

The financial incentives as they currently exist run counter to the public good.

joel_feila@lemmy.world on 20 Mar 2024 17:40 collapse

Yeah, I have made that argument before. By pushing content via recommended lists and autoplay, YouTube becomes a publisher and needs to be held accountable.

hybridhavoc@lemmy.world on 20 Mar 2024 19:43 collapse

Not how it works. Also your use of “becomes a publisher” suggests to me that you are misinformed - as so many people are - that there is some sort of a publisher vs platform distinction in Section 230. There is not.

joel_feila@lemmy.world on 21 Mar 2024 11:58 collapse

Oh no, I am aware of that distinction. I just think it needs to go away and be replaced.

Currently Section 230 treats websites as not responsible for user-generated content. Example: if I made a video defaming someone, I get sued, but YouTube is in the clear. But if The New York Times publishes an article defaming someone, they get sued, not just the writer.

Why? Because the NYT published that article, but YouTube just hosts it. This publisher/platform distinction is not stated in Section 230, but it is part of US law.

hybridhavoc@lemmy.world on 08 Apr 2024 23:37 collapse

This is frankly bizarre. I don’t understand how you can even write that and reasonably think that the platform hosting the hypothetical defamation should have any liability there. Like this is actually a braindead take.

reverendsteveii@lemm.ee on 20 Mar 2024 17:26 next collapse

this protection does not extend to knowingly facilitating or encouraging illegal activities.

if it’s illegal to encourage illegal activities it’s illegal to build an algorithm that automates encouraging illegal activities

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:39 collapse

What if I built software to build software to do it?

hybridhavoc@lemmy.world on 20 Mar 2024 19:40 collapse

Repealing Section 230 would actually have the opposite effect, and lead to less moderation as it would incentivize not knowing about the content in the first place.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:39 collapse

I can’t see that. Not knowing about the content would be an impossible position to maintain, since you would be getting reports. Now you might say they’d just disable reports, and they might try, but they have to do business with other companies who will require that they keep them. Apple isn’t going to let your social media app on if people are yelling at Apple about the child porn and bomb threats on it, AWS will kick you as well, and even Cloudflare might consider you not worth the legal risk. This has already happened multiple times, even with Section 230 providing a lot of immunity to these companies. Without that immunity they would be even more likely to block.

baru@lemmy.world on 20 Mar 2024 14:58 next collapse

Those sites determine what they promote. Such sites often promote extreme views as it gets people to watch or view the next thing. Facebook for instance researched this outcome, then ignored that knowledge.

reverendsteveii@lemm.ee on 20 Mar 2024 17:25 collapse

is the pusher guilty? last time I checked he didn’t grow the poppies or process them into heroin.

yarr@feddit.nl on 20 Mar 2024 18:00 next collapse

That’s why we have separate charges for drug manufacturing and distribution.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:42 collapse

I never liked that logic; it’s basically “success has many fathers but failure is an orphan” applied.

Are you involved with something immoral? The extent of your involvement is the extent of how immoral your actions are. The same goes for doing the right thing.

The_Tired_Horizon@lemmy.world on 20 Mar 2024 16:28 next collapse

I gave up reporting on major sites where I saw abuse. Stuff that, if you said it in public, witnessed by others, you’d be investigated for. Twitter was also bad for responding to reports with “this doesn’t break our rules” when a) it clearly did and b) it probably broke a few laws.

reverendsteveii@lemm.ee on 20 Mar 2024 17:23 next collapse

I gave up after I was told that people DMing me photographs of people committing suicide was not harassment, but me referencing Yo La Tengo’s album “I Am Not Afraid Of You And I Will Beat Your Ass” was worthy of a 30-day ban.

PiratePanPan@lemmy.dbzer0.com on 20 Mar 2024 18:22 next collapse

I remember one time somebody tweeted asking what the third track off Whole Lotta Red was, and I watched at least 50 people get perma’d before my eyes.

The third track is named Stop Breathing.

LiveLM@lemmy.zip on 20 Mar 2024 22:47 collapse

I TAKE MY SHIRT OFF AND ALL THE HOES STOP BREATHIN’ accessing their Twitter accounts WHEH? 🧛‍♂️🦇🩸

PiratePanPan@lemmy.dbzer0.com on 02 Apr 2024 19:14 collapse

sLatt! +*

The_Tired_Horizon@lemmy.world on 20 Mar 2024 19:22 collapse

On YouTube I had a persistent one who only stopped threatening to track me down and kill me (over a road safety video) when I posted the address of a local police station and said “pop in, any time!”

[deleted] on 20 Mar 2024 18:00 collapse

.

cows_are_underrated@feddit.de on 20 Mar 2024 18:55 next collapse

That’s true, but a lot of things are illegal everywhere. Sexual harassment or death threats will get you a lawsuit in probably every single country in the world.

prole@sh.itjust.works on 20 Mar 2024 19:18 collapse

Lawsuits are for civil cases. If someone breaks a law, they’re charged by authorities at their discretion.

The_Tired_Horizon@lemmy.world on 20 Mar 2024 19:23 collapse

Laws against threats to kill, rape and assault tend to be pretty constant across the world… 🤷‍♂️

otp@sh.itjust.works on 20 Mar 2024 20:34 collapse

Americans online regularly tell me that that’s protected free speech down there! Haha

[deleted] on 20 Mar 2024 18:50 next collapse

.

[deleted] on 20 Mar 2024 19:17 collapse

.

[deleted] on 20 Mar 2024 19:23 collapse

.

[deleted] on 20 Mar 2024 19:34 collapse

.

[deleted] on 20 Mar 2024 19:49 collapse

.

[deleted] on 20 Mar 2024 19:51 next collapse

.

[deleted] on 20 Mar 2024 19:56 collapse

.

[deleted] on 20 Mar 2024 20:03 collapse

.

[deleted] on 20 Mar 2024 20:28 collapse

.

[deleted] on 20 Mar 2024 20:38 next collapse

.

[deleted] on 20 Mar 2024 20:44 collapse

.

[deleted] on 20 Mar 2024 21:46 collapse

.

[deleted] on 20 Mar 2024 22:19 collapse

.

[deleted] on 20 Mar 2024 22:52 collapse

.

[deleted] on 20 Mar 2024 22:30 collapse

.

Kalysta@lemmy.world on 20 Mar 2024 19:13 next collapse

Love Reddit’s lies about them taking down hateful content when they’re 100% behind Israel’s genocide of the Palestinians and will ban you if you say anything remotely negative about Israel’s govenment. And the amount of transphobia on the site is disgusting. Let alone the misogyny.

ristoril_zip@lemmy.zip on 20 Mar 2024 19:15 next collapse

Do you mean “behind” like responsible for or in favor of?

Syringe@lemmy.world on 20 Mar 2024 20:29 next collapse

I think from context we can assume in favor of. I don’t think anyone is accusing Reddit of masterminding the Gaza conflict. I haven’t been to /r/conspiracy in a while though.

Kalysta@lemmy.world on 21 Mar 2024 13:50 collapse

In favor of. As in they support Israel unquestioningly. Though as someone else commented I wouldn’t put anything past r/conspiracy these days.

captainlezbian@lemmy.world on 20 Mar 2024 20:55 collapse

Lol, yeah I moderated major trans subreddits for years. It was entirely hit and miss if we’d get support from the admins

ristoril_zip@lemmy.zip on 20 Mar 2024 19:15 next collapse

“Noooo it’s our algorithm we can’t be held liable for the program we made specifically to discover what people find a little interesting and keep feeding it to them!”

RagingRobot@lemmy.world on 20 Mar 2024 19:23 next collapse

I wonder: if you built a social media site whose main feature was an algorithm that just showed you things in sequential order, like in the old days, would it be popular?

Quill7513@slrpnk.net on 20 Mar 2024 19:27 next collapse

People complain about Mastodon’s lack of algorithms a lot. It’s part of how Misskey, Iceshrimp, and Catodon came to be.

Hillock@feddit.de on 20 Mar 2024 19:41 next collapse

No, there is too much content for that nowadays. YouTube has over 3 million new videos each day. Facebook, TikTok, and Instagram also have ridiculous amounts of new posts every day. Browsing Reddit on New was a terrible experience on r/all or even in many of the bigger subs. Even on the fediverse, sorting by new is not enjoyable: you are swarmed with reposts and content that’s entirely uninteresting to you.

It works in smaller communities but there it isn’t really necessary. You usually have an overview of all the content anyhow and it doesn’t matter how it’s ordered.

Any social media that plans on scaling up needs a more advanced system.
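For a sense of what a “more advanced system” minimally looks like without any per-user profiling, here is a sketch of the log-score-with-time-decay “hot” ranking that Reddit-style sites and Lemmy use. The published formulas have this general shape, but the constants here are illustrative, not any site’s exact code:

```python
import math
from datetime import datetime, timezone

def hot_rank(score: int, published: datetime, gravity: float = 1.8) -> float:
    # Log damping: the 10th upvote matters far more than the 1,000th
    order = math.log10(max(abs(score), 1))
    # Time decay: older posts sink regardless of score
    age_hours = (datetime.now(timezone.utc) - published).total_seconds() / 3600
    return order / (age_hours + 2) ** gravity
```

A feed sorted by this surfaces genuinely popular posts while pushing down stale ones, and notably it never looks at who is reading, which is the key difference from engagement-maximizing recommenders.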

fosstulate@iusearchlinux.fyi on 21 Mar 2024 23:22 collapse

Fuck scale.

RaoulDook@lemmy.world on 20 Mar 2024 19:55 next collapse

I enjoy using Lemmy mostly that way, just sorting the feed by new / hot / whatever and looking at new posts of random shit. Much more entertaining than video-spamming bullshit.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:29 collapse

So a paper encyclopedia set? How is Britannica doing?

John_McMurray@lemmy.world on 20 Mar 2024 21:19 collapse

I find it very weird to be living in a country that is legalizing drugs and assisted suicide (even for depression) while simultaneously trying to severely curtail free speech and media freedom, and passing legislation to jail people “at risk of breaking the law” who don’t meet the “conspiracy to commit” threshold.

BreakDecks@lemmy.ml on 20 Mar 2024 21:43 collapse

What exactly are you referring to here?

John_McMurray@lemmy.world on 20 Mar 2024 21:46 next collapse

Been on the canadian news the last few days, federal Liberals are trying to bring in pre-crime legislation.

BreakDecks@lemmy.ml on 21 Mar 2024 01:14 collapse

This thread is about a mass shooter from New York. Are you lost?

[deleted] on 21 Mar 2024 02:39 collapse

.

John_McMurray@lemmy.world on 20 Mar 2024 21:49 collapse
huquad@lemmy.ml on 20 Mar 2024 19:33 next collapse

Tonight we investigate the hacker known as 4chan. More at 9.

GooglyBear@lemmy.world on 20 Mar 2024 22:24 collapse

Who is this “Four Chan”?

skozzii@lemmy.ca on 20 Mar 2024 20:10 next collapse

YouTube feeds me so much right wing bullshit I’m constantly marking it as not interested. It’s a definite problem.

afraid_of_zombies@lemmy.world on 20 Mar 2024 20:28 next collapse

I can’t prove that they were related, but I used to report all conservative ads (Hillsdale, Epoch Times, etc.) to Google with all-caps messages saying I was going to start calling the advertisers directly and yelling at them about the ads. About 2-3 days after I started doing that, the ads stopped.

I would love for other people to start doing this to confirm that it works and to be free of the ads.

Tom_Hanx_Hail_Satan@lemmy.ca on 20 Mar 2024 22:13 next collapse

That worked for me also. I like a lot of sports docs on YouTube. That triggered nonstop Joe Rogan suggestions and ads for all kinds of right-wing news trash.

[deleted] on 20 Mar 2024 22:18 next collapse

.

afraid_of_zombies@lemmy.world on 20 Mar 2024 23:04 collapse

Can you try my method and see if it works? I am really curious

[deleted] on 21 Mar 2024 03:42 collapse

.

Krudler@lemmy.world on 20 Mar 2024 22:27 collapse

I quit drinking years ago and I reported every alcohol ad, explaining that I am no longer their target market and the ads are literally dangerous to me. They were gone within a few weeks; I haven’t seen a booze ad in 5+ years.

S_H_K@lemmy.dbzer0.com on 20 Mar 2024 20:37 next collapse

It’s fucking insane how much that happens. I stopped using Instagram for that reason; at least YT listened to my “not interested” choices. I also have ReVanced, so IDK what ads it would shoot at me.

Duamerthrax@lemmy.world on 20 Mar 2024 20:57 next collapse

It’s amazing how often I get suggested a video from some right-wing source complaining about censorship and being buried by YouTube. I ended up installing a third-party channel blocker to deal with it.

CaptPretentious@lemmy.world on 20 Mar 2024 23:34 collapse

YouTube started feeding me that stuff too. Weirdly, once I started reporting all of them as misinformation, they stopped showing up for some reason…

FiniteBanjo@lemmy.today on 20 Mar 2024 20:16 next collapse

Maybe this will lead to a future where Stochastic Terrorism isn’t a protected activity?

blazera@lemmy.world on 20 Mar 2024 20:57 next collapse

Personally I believe in free will. Nothing should take any responsibility away from the one that chose to kill.

BreakDecks@lemmy.ml on 20 Mar 2024 21:44 collapse

Justice isn’t a zero sum game. More than one person can contribute to the same crime.

blazera@lemmy.world on 21 Mar 2024 20:39 collapse

sure, if there were multiple shooters

EmperorHenry@discuss.tchncs.de on 20 Mar 2024 21:00 next collapse

So now anyone who says things is going to be held accountable for crazy people being crazy?

What a lovely world we live in. That’s worse than what CNN kept saying about the Joker after that mass shooting at the theater that happened to be showing “The Dark Knight Rises” at the time.

PoliticalAgitator@lemmy.world on 21 Mar 2024 04:10 collapse

So now anyone who says things is going to be held accountable for crazy people being crazy

Nope, that’s just you being melodramatic. The judge has acknowledged there are grounds for the case to be argued, so it won’t be dismissed. That’s all. They haven’t been found guilty of anything. They’re not being lined up and shot.

What would you prefer we did to determine if a company is culpable? Just ask you because you read a headline?

Canyon201@lemmy.world on 20 Mar 2024 21:29 next collapse

Right in the IPO price!

Krudler@lemmy.world on 20 Mar 2024 22:43 next collapse

I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me, after I let slip in a comment one day that I was sober - I had previously never made such a comment because my sobriety journey was personal, and I never wanted to define myself or pigeonhole myself as a “recovering person”.

I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

Yes, Reddit and similar sites are definitely responsible for a lot of suffering and pain, at the expense of humans, in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s

The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

<img alt="" src="https://lemmy.world/pictrs/image/a54c9b72-b466-4cf6-891a-6f0d7487bbc0.png">

Edit: Oh, and the hilarious part that many people won’t let go of (when shown this) is that it says it’s based on my activity in the Drunk subreddit, which I had never once visited, commented in, posted in, or was even aware of. So that just makes it worse.

mlg@lemmy.world on 20 Mar 2024 23:16 next collapse

It’s not reddit if posts don’t get nuked or shadowbanned by literal sitewide admins.

Krudler@lemmy.world on 21 Mar 2024 01:43 collapse

Yes I was advised in the removal notice that it had been removed by the Reddit Administrators so that they could keep Reddit “safe”.

I guess their idea of “safe” isn’t 4+ million users going into their privacy panel and turning off exploitative sub recommendations.

Idk though I’m just a humble bird lawyer.

KairuByte@lemmy.dbzer0.com on 21 Mar 2024 14:35 collapse

Yeah, this happens a lot more than people think. I used to work at a hotel, and when the large sobriety group got together yearly, the hotel changed bar hours from the normal schedule to as close to 24/7 as it could legally get. It also raised the prices on alcohol.

porksoda@lemmy.world on 20 Mar 2024 23:22 next collapse

Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple years ago though, I noticed that my front page really only showed content for 15-20 subreddits at a time and it was heavily weighted towards recent visits and interactions.

For example, if I hadn’t visited r/3DPrinting in a couple weeks, it slowly faded from my front page until it disappeared altogether. It got so bad that I ended up writing a browser automation script to visit all 120 of my subreddits at night and click the top link in each. This gave me a more balanced front page that mixed in all of my subreddits and interests.
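For anyone curious, a sketch of what such a nightly script might look like using Playwright. The subreddit list and selectors here are placeholders, not the commenter’s actual code, and a real version would also need a logged-in browser session for the visits to influence a personal front page:

```python
# Hypothetical reconstruction of a nightly "visit every subreddit" script.
from playwright.sync_api import sync_playwright

SUBREDDITS = ["3Dprinting", "AskHistorians", "woodworking"]  # ...the full 120

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    for sub in SUBREDDITS:
        # Load the sub's top-of-day listing on old.reddit (stable markup)
        page.goto(f"https://old.reddit.com/r/{sub}/top/?t=day")
        links = page.locator("a.title")
        if links.count() > 0:
            # Click the top post so the visit registers as an interaction
            links.first.click()
    browser.close()
```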

My point is these algorithms are fucking toxic. They’re focused 100% on increasing time on page and interaction with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.

CaptPretentious@lemmy.world on 20 Mar 2024 23:33 next collapse

YouTube does the exact same thing.

Smokeless7048@lemmy.world on 21 Mar 2024 05:25 collapse

that’s why I always use YouTube by subscriptions first, then only delve into the regular front page if there’s nothing interesting in my subscriptions

Carlo@lemmy.ca on 21 Mar 2024 01:07 next collapse

Yeah, social media algorithms are doing a lot of damage. I wish there was more general awareness of this. Based on personal experience, I think many people actually like being fed relevant content, and are blind to the consequences. I think Lemmy is great, because you have to curate your own feed, but many people would never use it for that very reason. I don’t know what the solution is.

Fedizen@lemmy.world on 21 Mar 2024 05:37 next collapse

I used the Google News phone widget years ago and clicked on a giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It’s just such a dumb way to populate feeds.

TopRamenBinLaden@sh.itjust.works on 21 Mar 2024 06:12 collapse

<img alt="" src="https://sh.itjust.works/pictrs/image/19ef08be-3669-474a-86a9-d533b199d714.jpeg">

Yo dawg, I heard you like asteroids. So I populated your entire feed with articles about asteroids.

r3df0x@7.62x54r.ru on 21 Mar 2024 16:17 collapse

I agree. It’s important to remember the only “conspiracy” is making money and keeping people on the platform. That said, it will cause people to go down rabbit holes. The solution isn’t as simple as “show people content they disagree with”, because they either ignore it or it creates another rabbit hole. For example, it would mean that progressives start getting bombarded with Tim Pool videos. I don’t believe Tim is intentionally “alt-right”, but that’s exactly why his videos are the most dangerous: they consist of nothing but conservative rage bait with a veneer of progressiveness that lets his viewers believe they aren’t being manipulated.

charonn0@startrek.website on 20 Mar 2024 23:38 next collapse

I think there’s definitely a case to be made that recommendation algorithms, etc. constitute editorial control and thus the platform may not be immune to lawsuits based on user posts.

UsernamesAreDifficult@lemmy.dbzer0.com on 21 Mar 2024 01:24 next collapse

Honestly, good, they should be held accountable and I hope they will be. They shouldn’t be offering extremist content recommendations in the first place.

whoreticulture@lemmy.world on 21 Mar 2024 02:07 next collapse

How does Lemmy pick which articles are at the top of my feed? Does anyone know how All is sorted?

realbadat@programming.dev on 21 Mar 2024 02:27 collapse

Same as sort settings for subscribed.

Active, top for the past x hours/days/month/all, new, etc. You pick, Lemmy doesn’t.

Fedizen@lemmy.world on 21 Mar 2024 05:34 next collapse

media: Video games cause violence.

media: Weird music causes violence.

media: Social media could never cause violence, this is censorship (also we don’t want to pay moderators).

Eximius@lemmy.world on 21 Mar 2024 07:20 collapse

Since “media” (as you define it by the tropes of unsubstantiated news outlets) couldn’t sensibly refer to a forum like reddit or even Facebook, this makes no sense.

casual_turtle_stew_enjoyer@sh.itjust.works on 21 Mar 2024 16:24 next collapse

I will testify under oath, with evidence, that Reddit, the company, has not only turned a blind eye to but also encouraged and intentionally enabled radicalization on their platform. It is the entire reason I am on Lemmy. It is the entire reason for my username. It is the reason I questioned my allyship with certain marginalized communities. It is the reason I tense up at the mention of turtles.

ItsMeSpez@lemmy.world on 21 Mar 2024 16:47 next collapse

As much as I believe it is a breeding ground for right-wing extremism, it’s a little strange that 4chan is being lumped in with these other sites for a suit like this. As far as I know, 4chan just bumps threads based on the number of people posting to them, and otherwise doesn’t employ a recommendation algorithm at all. Kind of a different beast to the others, who have active algorithms trying to drive engagement at any cost.

iquanyin@lemmy.world on 21 Mar 2024 16:52 next collapse

anything but gun regulation, i guess.

TropicalDingdong@lemmy.world on 21 Mar 2024 20:48 next collapse

I don’t understand how a social media company can face liability in this circumstance but a weapons manufacturer doesn’t.

gum_dragon@lemm.ee on 21 Mar 2024 21:57 collapse

Or the individuals who repeatedly spread the specific hateful ideology that radicalizes people and also encourages them to act on it.

PoliticallyIncorrect@lemm.ee on 22 Mar 2024 03:52 collapse

What about shutting down the whole internet?