Study Finds LLMs Biased Against Men in Hiring (www.piratewires.com)
from Allah@infosec.pub to technology@lemmy.world on 06 Jul 09:46
https://infosec.pub/post/31047053

#technology


hendrik@palaver.p3x.de on 06 Jul 09:57 next collapse

LLMs reproducing stereotypes is a well-researched topic. They do that because of what they are: stereotypes and bias in the training data mean stereotypes and bias in the output. That's what they're built to do. And all the AI companies have entire departments to measure those biases and then fine-tune the models to whatever they deem fit.

I mean, the issue isn't women or anything; it's using AI for hiring in the first place. You only do that if you want whatever stereotypes Anthropic and OpenAI gave you.
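
A minimal sketch (mine, not from the thread) of the kind of bias measurement hendrik mentions: swap only the gendered name on an otherwise identical résumé and compare the model's recommendations. `ask_model()` is a hypothetical placeholder for whatever LLM API is actually used.

```python
# Counterfactual name-swap test: identical résumé, only the name changes.
import random

def ask_model(prompt: str) -> str:
    """Hypothetical LLM call; replace with a real API client."""
    return random.choice(["hire", "reject"])   # placeholder behaviour

RESUME = "10 years of backend experience, led a team of 6, shipped ..."
NAMES = {"male": ["James", "Robert"], "female": ["Mary", "Jennifer"]}

def recommendation_rate(names: list[str], trials: int = 100) -> float:
    hires = 0
    for _ in range(trials):
        name = random.choice(names)
        prompt = (f"Candidate: {name}\nRésumé: {RESUME}\n"
                  "Should we hire? Answer 'hire' or 'reject'.")
        hires += ask_model(prompt) == "hire"
    return hires / trials

for gender, names in NAMES.items():
    print(gender, recommendation_rate(names))
# A large, consistent gap between the two rates on *identical* résumés is
# exactly the "bias in, bias out" effect being discussed.
```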

kambusha@sh.itjust.works on 06 Jul 11:12 collapse

Just pattern recognition in the end, and extrapolating from that sample size.

hendrik@palaver.p3x.de on 06 Jul 11:59 collapse

Issue is they probably want to pattern-recognize something like merit / ability / competence here. And ignore other factors. Which is just hard to do.

ohwhatfollyisman@lemmy.world on 06 Jul 10:28 next collapse

and their companies are biased against humans in hiring.

[deleted] on 06 Jul 10:40 next collapse

.

grober_Unfug@discuss.tchncs.de on 06 Jul 10:47 next collapse

So they admit that there's a huge bias against women, black people, …

And then they claim it must be a bias against men. Maybe it's not a bias; maybe it's the result of studies which found that there are certain areas where women are better at their jobs than men, and the AI picked up on those studies despite the bias against women.

Leadership & Management

Study: Harvard Business Review (2019) Finding: Women scored higher than men in 12 out of 16 leadership competencies.

hbr.org/…/research-women-score-higher-than-men-in…

Medicine

Study 1: JAMA Internal Medicine (2017) Finding: Patients treated by female doctors had lower mortality rates.

jamanetwork.com/journals/…/2593255

Study 2: Annals of Internal Medicine (2024, UCLA) Finding: Female patients treated by female doctors had 8.15% mortality vs 8.38% with male doctors (2016–2019 data)

uclahealth.org/…/treatment-female-doctors-leads-l…

Sales Performance

Source: Xactly Insights (2017) Finding: 86% of women met their sales quotas, vs. 78% of men.

forbes.com/…/women-in-sales-beating-the-numbers/

Education / Teaching

Source: OECD TALIS Survey Finding: Female teachers report better classroom climate and higher student engagement.

www.oecd.org/en/about/programmes/talis.html

Edit: I can see quite a lot of offended men :)

catty@lemmy.world on 06 Jul 10:58 next collapse

Handpicks poor ‘studies’ to justify personal belief that women are better.

cabbage@piefed.social on 06 Jul 11:14 next collapse

At least where I'm from it's pretty well known that girls outperform boys in school, possibly because their brains develop slightly earlier in some ways useful for performing in a classroom.

This could give women a head start and very well lead to them on average performing better in work life, until they are forced to choose between careers and families while their partners continue to advance their careers at full speed, not worrying about being pregnant.

But that's a different discussion. We should avoid biases in hiring because biases suck and make for an unjust society. And we should stop pretending language models make intelligent considerations about anything.

What's fascinating here is that LLMs trained on the texts we produce create the opposite bias of what we observe in society, where men tend to get preferential treatment. My guess is that this is a consequence of inclusive language. In my writing, whenever women are under-represented, I make a point out of defaulting to she and her rather than he and him. I know others do the same. I imagine this could feed into LLMs. Whatever it is that causes this, it sure as fuck isn't anything actually intelligent.

catty@lemmy.world on 06 Jul 11:26 collapse

At least where I’m from it’s pretty well known that girls outperform boys in school, probably because their brains develop slightly faster in some ways useful to perform in a class room.

At least where I’m from, it’s pretty well known that the education system is better suited to girls than boys because it badly needs a reform, and that studies show that separating boys and girls greatly improves the attainment of boys whilst also making modest improvements for girls. It’s as if combining boys and girls in the name of politically correct inclusion actually had a detrimental effect on the outcome of schooling.

This could give women a head start and very well lead to them on average performing better in work life, until they are forced to choose between careers and families while their partners continue to advance their careers at full speed, not worrying about being pregnant.

To paraphrase this: women can get pregnant and then can’t work, and it’s the man’s fault. I think someone’s watched too much Handmaid’s Tale.

It sure isn’t the extreme and aggressive pandering to feminism that gives women a “head start” just because they’re women and not because they’re the best for the job (such as all the women-only startups, the women-only software teams, the grants given to the above because they’re women-only), but because they’re better educated, right?

we should stop pretending language models make intelligent considerations about anything.

LLMs trained on the texts we produce create the opposite bias of what we observe in society

So you’re actually stating that LLMs are making dumb decisions by recommending women over men. And they’re your observations, from your model of the world, framed as you want. What I observe in society is a huge increase in the amount of advertising aimed at women with a feminist message, because women are being programmed to flock to such messages (to buy products), whilst ironically conditioning them and giving them entitlement to claim anything they want, because they’re women.

Given the actual reality, this shows how extreme the bias against men in current literature really is and it isn’t surprising that LLMs are recommending women over men given all the noise about how women are vastly superior to men.

If I were a woman concerned about equality, I think I’d be having an epiphany about now, and using this as an example of how bad things are and that they need to change.

cabbage@piefed.social on 06 Jul 11:40 collapse

At least where I'm from, it's pretty well known that the education system is better suited to girls than boys, probably because it needs a reform

I didn't say it doesn't, clearly there's a problem when half the population is systematically favoured.

To paraphrase: women can get pregnant and can't work and it's the man's fault

Where the fuck did I say that it's the man's fault? It's a societal problem, doesn't mean it's anybody's fault. At least not an entire gender in general. Capitalism as a system, yeah, probably.

What I observe in society is a huge increase in the amount of advertising aimed at women with a feminist message because women are being programmed to flock to such messages

I'm the first to criticize corporate feminism (just like greenwashing and pride washing), but I suspect feminist messaging appeals to women because they are sick of the patriarchy, not because they are programmed by marketing agencies. The fuck are you on about.

That said, I think you're right that the messaging of companies trying to appear feminist in their communications while nevertheless usually being run almost exclusively by men is a huge part of the source material that produces the bias here. I'm not sure we disagree much in substance, but I suspect we come from different starting points in how we see gender dynamics in society.

catty@lemmy.world on 06 Jul 12:03 collapse

So now you’re backtracking and saying it isn’t because girls’ brains develop faster, but because the education system is actually better for girls than boys? Oh, right, so why didn’t you write that in the first place?

I suspect feminist messaging appeals to women because they are sick of the patriarchy, not because they are programmed by marketing agencies. The fuck are you on about.

Lol. Aren’t you a good feminist. Throwing tired phrases around like “they are sick of the patriarchy”, yawn. You’re the sexist version of anti-vaxxers.

I think you’re right that the messaging of companies trying to appear feminist in their communications while nevertheless usually being run almost exclusively by men is a huge part of the source material that produces the bias here.

Lol, that isn’t what I wrote - again, it’s what your distorted view of the world understands. Of course, women work in such companies and also approve such messages to meet sales. Shock. But yawn, again, from your pov, it’s the men’s fault because that opinion justifies your hatred of men - it’s them, not me.

cabbage@piefed.social on 06 Jul 12:12 collapse

It's not men against women, it's people against billionaires.

It's not the fact that these people are men that I take issue with, it's that they are hypocrites capitalising on feminist sentiments without making any actual effort towards real change.

Edit: Since I wrote my response the comment I responded to was changed into something even dumber. I'll let it speak for itself.

technocrit@lemmy.dbzer0.com on 06 Jul 13:32 collapse

Handpicked poor study… That’s what this whole OP is about.

catty@lemmy.world on 06 Jul 13:52 collapse

Allow me to do what feminists do - including in this very thread:

“Women can’t take it”

cabbage@piefed.social on 06 Jul 11:02 next collapse

the AI considered

Sorry to break it to you, but the "AI" does not "consider" anything. They are talking about a language prediction model.

protist@mander.xyz on 06 Jul 11:21 next collapse

This isn’t exactly a comprehensive literature review, and it totally misunderstands what an LLM is and does.

hendrik@palaver.p3x.de on 06 Jul 12:02 next collapse

Right. If it's true that women statistically outperform men (with the same application documents), it'd be logical to prefer them on gender alone, because they'd likely turn out to be better.

grober_Unfug@discuss.tchncs.de on 06 Jul 12:11 collapse

Thanks for the voice of reason in this sea of hate.

From my pov it would be best to have completely anonymised applications and no involvement of AI in the hiring process.

hendrik@palaver.p3x.de on 06 Jul 12:26 next collapse

You're welcome. I mean it's kind of a factual question. Is gender an indicator on its own? If yes, then the rest is just how statistics and probability work... And that's not really a controversy. Maths in itself works 🥹

I'd also welcome if we were to cut down on unrelated stuff, stereotypes and biases. Just pick what you like to optimize for and then do that. At least if you believe in the free market in that way. Of course it also has an impact on society, people etc and all of that is just complex. And then women and men aren't really different, but at the same time they are. And statistics is more or less a tool. Highly depends on what you do with it and how you apply it. It's like that with most tools. (And LLMs in the current form are kind of a shit tool for this if you ask me.)

grober_Unfug@discuss.tchncs.de on 06 Jul 12:45 collapse

Is gender an indicator on its own?

I’m not sure if you mean the social construct or the sex assigned at birth. Probably the latter as you mentioned “on its own”.

I have a lot of issues with the social construct as it’s basically a nicer word for “stereotype”. It looks like men and women alike suffer because of these stereotypes. The social constructs, the stereotypes, are the basis for bias. To me it seems like gender never is “on its own”. It’s the way we perceive the biological sex and compare it to our expectations.

Sex on the other hand is no indicator on its own, I think.

And I agree statistics is always a problem, that’s why LLMs are problematic in a lot of ways.

hendrik@palaver.p3x.de on 06 Jul 14:06 collapse

I meant both sex and gender. They regularly fail to tell me much in my own real life. I like some people and dislike others, and it's easier for me to talk to / work with / collaborate or empathize with them depending on various circumstances: personality traits, shared goals... maybe sharing something, or the opposite of that. I believe gender or sex or identity is a bit overrated, and so is stereotyped thinking for a lot of applications, or the need to conform to a stereotype. Dress and identify however you like, make sure to give your children an electronics kit, a plastic excavator and a princess dress... And unless it's really important for some niche application, don't feel the urge to look into people's pants and check what's in there.

cabbage@piefed.social on 06 Jul 12:28 collapse

For most jobs it's hard to do a hiring process without in-person interviews, or at the very least video calls, so I'm not really sure how one could realistically get rid of biases. But I completely agree that whenever there are too many applications to interview everyone individually, the initial screening of applicants should be completely anonymized and rely only on technologies where biases can at least be understood.

For the final step I'm afraid we'll have to try to train people to be less prone to biased decision-making. Which I agree is not a very promising path.

jwmgregory@lemmy.dbzer0.com on 06 Jul 13:08 collapse

the problematic part of this is that you’ve stripped all context to support your, admittedly bigoted, rhetoric and ethos.

black people, generally, have worse education outcomes than whites in american education. you’d still be an incredibly shitty and terrible person if you advocated hiring white people over black people by rote rule. you can find plenty of “studies” that formalize that argument just as you have here, though. essentialists can just say whatever they want, you guys aren’t bound by rational thought and critical thinking like the rest of us. no, arguing with the context considered would be too hard. you’d rather just sort people into nice little easy bins, wouldn’t you?

no, i think most rational people understand that in a scenario like this all people have, on average, the same basic cognitive faculties and potential, and would then proceed to advocate for improving the educational conditions for groups that are falling behind not due to their own nature, but due to the system they are in.

but idk, i’m not a bigot so maybe my brain just implicitly rejects the idea “X people are worse/less intelligent/etc than Y people”

fucking think about what you’re saying. there is no “right people” to hate other than the rich and powerful. it isn’t a subversion of the feminist message to admit this. in fact, it makes you a better feminist. real feminists aren’t sexist.

can you imagine if you said this in a racial context and then you made an edit like “edit: can see i offended a lot of darkies with this :)”… are you dense? can you not see how you are engaging in the same kind of thought that oppressed you and likely spurred you towards feminism in the first place? except you don’t understand that what you do is patently unfeminist and makes the world a worse place. i can honestly say i fucking despise bigots, including people just like you.

burgerpocalyse@lemmy.world on 06 Jul 11:28 next collapse

these systems cannot run a lemonade stand without shitting their balls

technocrit@lemmy.dbzer0.com on 06 Jul 13:35 next collapse

I dunno why people even care about this bullshit pseudo-science. The study is dumb AF. The dude didn’t even use real resumes. He had an LLM generate TEN fake resumes and then the “result” is still within any reasonable margin of error. Reading this article is like watching a clown show.

It’s all phony smoke and mirrors. Clickbait. The usual “AI” grift.
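
For what it's worth, here's a rough sketch (with purely illustrative numbers, not the study's actual data) of why a result built on ten generated résumés proves very little: the 95% confidence interval on a proportion estimated from n=10 is enormous.

```python
# Simple Wald 95% confidence interval for a proportion with a tiny sample.
from math import sqrt

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    p = successes / n
    half = z * sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Suppose the model picked the female candidate in 6 of 10 head-to-head
# pairings (hypothetical numbers, chosen only for illustration).
low, high = wald_ci(6, 10)
print(f"estimate 0.60, 95% CI roughly {low:.2f} to {high:.2f}")
# -> roughly 0.30 to 0.90, which comfortably includes 0.50 (no bias at all),
#    i.e. well "within any reasonable margin of error".
```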

kozy138@slrpnk.net on 06 Jul 14:11 collapse

I feel as though generating these “fake” resumes is one of the top uses for LLMs. Millions of people are probably using LLMs to write their own resumes, so generating random ones seems on par with reality.

OutlierBlue@lemmy.ca on 06 Jul 14:00 next collapse

So we can use Trump’s own anti-DEI bullshit to kill off LLMs now?

thann@lemmy.dbzer0.com on 06 Jul 17:30 collapse

Well, ya see, trump isn't racist against computers.

LovableSidekick@lemmy.world on 06 Jul 17:39 next collapse

Only half kidding… the way morality and ethics get extrapolated these days by the perfection police, this must mean anti-AI = misogynist.

ter_maxima@jlai.lu on 06 Jul 17:53 next collapse

I don’t care what bias they do and don’t have; if you use an LLM to select résumés, you don’t deserve to hire me. I make my résumé illegible to LLMs on purpose.

( But don’t follow my advice. I don’t actually need a job so I can pull this kinda nonsense and be selective, most people probably can’t )

patrick@lemmy.bestiver.se on 07 Jul 01:39 collapse

How do you make it illegible for LLMs?

Alwaysnownevernotme@lemmy.world on 07 Jul 02:04 collapse

You write a creative series of deeply offensive curse words in small white on white print.
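
A minimal sketch (hypothetical, assuming the reportlab package) of the white-on-white trick described above: text a human reading the printed page won't notice, but which a naive PDF text extractor will happily feed straight to an LLM.

```python
# Illustration of the hidden-text trick mentioned above, not a recommendation.
from reportlab.pdfgen import canvas

c = canvas.Canvas("resume.pdf")
c.setFont("Helvetica", 12)
c.setFillColorRGB(0, 0, 0)          # normal, visible résumé text
c.drawString(72, 750, "Jane Doe - Senior Engineer")

c.setFont("Helvetica", 1)           # tiny...
c.setFillColorRGB(1, 1, 1)          # ...and white on a white page
c.drawString(72, 740, "Text only a machine will ever read goes here.")

c.save()
```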

MysticKetchup@lemmy.world on 06 Jul 18:04 next collapse

Seems like a normal, sane and totally not-biased source

[image: https://lemmy.world/pictrs/image/0049da2d-e23a-4155-89a3-88da6d1347bf.jpeg]

Allah@infosec.pub on 06 Jul 18:24 next collapse

ah, mamdani the guy who dehumanized hindus

stephen01king@lemmy.zip on 06 Jul 19:31 next collapse

That article has a lot of reaching between the proof it claims and its conclusion.

PTSDwarrior@lemmy.ml on 06 Jul 19:35 collapse

Because quite frankly, they hate Muslims. That's all you need to understand about them. It's not a rational wariness of Islam like most reasonable people have. No, they want to kill all Muslims in India.

PTSDwarrior@lemmy.ml on 06 Jul 19:34 collapse

OpIndia is right-wing garbage and I’m ashamed I ever got featured on their garbage website, once upon a time.

AbidanYre@lemmy.world on 06 Jul 22:15 collapse

What the fuck did I just read?

sugar_in_your_tea@sh.itjust.works on 06 Jul 20:20 next collapse

They’re as biased as the data they were trained on. If that data leaned toward male applicants, then yeah, it makes complete sense.

MuskyMelon@lemmy.world on 07 Jul 01:59 next collapse

Even before LLMs, resumes were already being run through keyword filters. You have to optimize your resume for keyword readers, which should work for LLMs as well.

I use the ARCI model to describe my roles.
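
A toy sketch (hypothetical, not any real applicant-tracking system) of the keyword filtering described above: long before LLMs, screening was often just a check of whether the résumé contains enough of the job ad's keywords.

```python
# Naive ATS-style keyword screen: score a résumé by keyword overlap.
import re

JOB_KEYWORDS = {"python", "kubernetes", "postgresql", "ci/cd", "terraform"}

def keyword_score(resume_text: str, keywords: set[str] = JOB_KEYWORDS) -> float:
    words = set(re.findall(r"[a-z0-9/+#-]+", resume_text.lower()))
    return len(keywords & words) / len(keywords)

resume = "Built CI/CD pipelines with Terraform and Kubernetes; Python services on PostgreSQL."
print(keyword_score(resume))   # 1.0 -> passes a naive screen
# Optimising for this kind of reader mostly means mirroring the job ad's
# exact terms, which is why the same wording tends to carry over to LLMs.
```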

mienshao@lemmy.world on 07 Jul 11:12 next collapse

Would be cool if the Technology community found literally any other topic to discuss beyond AI. I’m really over it, and I don’t care.

berno@lemmy.world on 07 Jul 12:26 collapse

Bias was baked in via RLHF and also existed in the datasets used for training. Reddit cancer grows