More than 140 Kenya Facebook moderators diagnosed with severe PTSD (www.theguardian.com)
from robber@lemmy.ml to technology@lemmy.world on 18 Dec 20:49
https://lemmy.ml/post/23731367

#technology


bassomitron@lemmy.world on 18 Dec 22:04

I hope they win their lawsuit. I listened to a Radiolab episode a few years ago about FB moderators. The shit they have to see day in and day out sounds absolutely horrible. Pics and videos of extreme violence and child pornography sound like they'd give any normal person some major trauma.

NOT_RICK@lemmy.world on 18 Dec 22:37

Exposed to a firehose of the worst humanity has to offer. I can’t even imagine

PerogiBoi@lemmy.ca on 19 Dec 15:06

Go on the Reddit front page or Twitter home page with a heart rate and blood pressure monitor and scroll for 10 minutes.

You will find that in almost every instance, both measures go up. The whole point of social media is to agitate because agitation correlates with engagement which correlates with mucho ad dinero.

GrammarPolice@lemmy.world on 19 Dec 16:59

That’s not fb’s fault though?

bassomitron@lemmy.world on 19 Dec 17:31

The company should be doing more to support these employees; that's the point. Right now, Meta doesn't give a fuck if their employees are getting severely traumatized trying to keep this content off their platforms. They don't pay them much, don't offer mental health resources, etc. An admittedly imperfect analogy: a construction company with no heavy-machinery safety policies that, when employees get hurt and can't work anymore, just fires them with no workers' comp.

For comparison, hospitals or law enforcement provide therapy and/or other mental health resources for their employees, since those jobs put their employees in potentially traumatic positions with some frequency (e.g. a doctor/nurse witnessing death a lot).

GrammarPolice@lemmy.world on 19 Dec 19:36

Yeah you’re right

the_crotch@sh.itjust.works on 20 Dec 20:26

Blows my mind that people would post CSAM on Facebook of all places

ArbitraryValue@sh.itjust.works on 18 Dec 22:04

One time on Reddit, a mod of /r/askhistorians described some of the content of this sort that he had seen, and he wasn't as dispassionate about it as this article is. His description alone was disturbing and difficult to forget, so I can believe that these employees are traumatized.

With that said, what about other careers that expose people to disturbing things? I used to know a pathologist who once told me that she had a bad day because two infants died during childbirth at the hospital where she worked. She had to autopsy them. I didn’t know her well at the time so of course I assumed that she was upset for the same reason that such direct exposure to the death of babies would upset most people, but I was wrong. She was upset because she had to work late.

Why can pathologists do their job without being traumatized? Maybe the difference is that pathology isn’t something that a guy off the street just gets hired to do one day. The people who end up being pathologists usually have other options, and they choose pathology because it doesn’t particularly bother them for whatever reason. Meanwhile these moderators are immediately dumped into the deep end, so to speak, and they may not be financially secure enough to leave the job even after they experience what it is.

Can content moderation be done without traumatizing people? It isn't a high-skilled, well-paid job, so I don't think filtering candidates the way pathologists are filtered is practical. Not having content moderators also isn't practical.

(I’m using pathology as an example because that’s what I know a little about, but I think my statements are probably valid for other careers, like homicide detective, which also involve regular exposure to disturbing things.)

bassomitron@lemmy.world on 18 Dec 23:13

In my other comment, I mentioned a Radiolab episode that discussed this. The main problem is, like you say, that they take people off the street and offer little to no training. They also don't offer any mental health resources for their employees, and the pay is pretty awful. Turnover is extremely high for these and many other reasons.

conditional_soup@lemm.ee on 19 Dec 01:47

It’s definitely to do with work conditions. I’ve been a paramedic for fifteen years, and suffice it to say, I’ve seen (and smelled and heard) some shit. I’ve always felt I had a harder time processing the stuff from when I worked in a busy metro system, where we had to go from coding a kid who drowned just fifteen feet from a party full of adults, to holding grandma’s hand and making her warm and comfy on our five-minute jaunt to dialysis, to “hey, there’s a car on fire and bystanders report hearing screaming from the vehicle.” I would regularly get three hours of sleep over a 72-hour period and have almost no time to process the horrible shit we saw, while still having to be a functioning, caring professional for every patient along the way. The equally horrible shit I saw in the slower rural area where I worked has haunted me a lot less. There’s probably more to the whole picture than that, but I’m confident that work conditions are a huge factor.

[deleted] on 19 Dec 17:13

.

kofe@lemmy.world on 19 Dec 17:18

I’ll add that two people can experience the same trauma but only one develops PTSD. I started a lecture series on trauma a while back that opened by explaining that if you find yourself starting to have symptoms, try to catch yourself and remind yourself that you’re in a learning environment. That’s just one method, and it can be difficult to maintain without further education and training.

FourPacketsOfPeanuts@lemmy.world on 18 Dec 22:08

Look… this is going to sound exceedingly stupid… but shouldn’t they find a way to use convicted sex offenders to filter out CSAM? They’re not going to be traumatised by it, and it saves some other poor soul from taking the brunt. How do you motivate them to actually do it? Well, the first one has to flag, and a second one earns a bonus every time they catch the first one flagging wrong. Motivation aligned!

</joking… but seriously though…>

AwesomeLowlander@sh.itjust.works on 19 Dec 20:07

There’s actually a lot of logic to the general idea of hiring specific population groups that don’t get traumatised by the content they’re checking. Problem is FB (and other companies in the same vein) can’t be bothered to do anything except hire the lowest bidder.

ZiemekZ@lemmy.world on 21 Dec 02:16

Come on, many 4channers would do it for free, just for entertainment, since most of them are NEET. Maybe give them some food and mattresses to sleep on in the office.

Infynis@midwest.social on 19 Dec 00:28

This is what AI should actually be used for. Strengthen the models that identify this content to reduce the load humans need to review, and it should hopefully be more manageable.
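A minimal sketch of the confidence-threshold triage that idea implies: the classifier auto-handles clear-cut cases, and only the ambiguous middle band ever reaches a human queue. The model, thresholds, and names here are hypothetical assumptions for illustration, not anything Meta is known to use.

```python
# Hypothetical triage sketch: an upstream ML classifier scores each post,
# and only the uncertain middle band is routed to human reviewers.
# Thresholds and names are illustrative assumptions.
from dataclasses import dataclass

AUTO_REMOVE_THRESHOLD = 0.98   # classifier is almost certain the content violates policy
HUMAN_REVIEW_THRESHOLD = 0.40  # ambiguous enough that a person should decide

@dataclass
class Post:
    post_id: str
    violation_score: float  # 0.0-1.0, produced by the upstream classifier

def triage(post: Post) -> str:
    """Route a post based on the classifier's confidence."""
    if post.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"         # no human ever has to see it
    if post.violation_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review_queue"  # the only slice reviewers look at
    return "no_action"

if __name__ == "__main__":
    for p in [Post("a", 0.99), Post("b", 0.65), Post("c", 0.05)]:
        print(p.post_id, triage(p))
```

The replies below point at the obvious failure mode: set the automated thresholds too aggressively and the false positives (a VR controller flagged as a gun, sand dunes flagged as porn) land directly on real users.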

PerogiBoi@lemmy.ca on 19 Dec 15:07

AI flagged my VR controller as a gun on Facebook and my account received a 30 day ban

Lennny@lemmy.world on 19 Dec 18:30

“We need more AI.” It’s like, mate, we need intelligence before we even attempt to make it artificial. We’re so fucked. AI’s the perfect tool for mass retardation.

Duamerthrax@lemmy.world on 19 Dec 16:47

I’ve already seen discussions of Nazi imagery in media get flagged for promoting Nazis by those systems. And to be clear, it was the villains who had the Nazi imagery, and the blog was discussing how fascists use charisma.

We’ve also seen sand dunes get flagged as pornography when Tumblr banned 18+ content.

Kolrami@lemmy.world on 19 Dec 17:48

The same Kenyans were probably used to train those AI models.

theguardian.com/…/ai-chatbot-training-human-toll-…

Magnolia_@lemmy.ca on 19 Dec 01:04

That’s racist

JovialMicrobial@lemm.ee on 19 Dec 19:53

Holy shit!

I haven’t been on FB for a long time, I mean years, so I didn’t realize it had basically turned into the dumping ground for the rest of the internet.

It reminds me of an article I read a long time ago about the police units that have to review CSAM and the toll it takes on them.

I don’t typically have much sympathy for the police, but anyone who has to look at that stuff and basically sustain psychological damage in order to convict the people who create it has my respect.

Those moderators don’t deserve that shit. I hope they win their lawsuit.