The 'bias machine': Undecided voters in the US who turn to Google may see dramatically different views of the world – even when they're asking the exact same question
(www.bbc.com)
from tardigrada@beehaw.org to technology@beehaw.org on 01 Nov 20:31
https://beehaw.org/post/16843731
Some experts say Google is just parroting your own beliefs right back to you. It may be worsening your own biases and deepening societal divides along the way.
[…]
“Google’s whole mission is to give people the information that they want, but sometimes the information that people think they want isn’t actually the most useful,” says Sarah Presch, digital marketing director at Dragon Metrics, a platform that helps companies tune their websites for better recognition from Google using methods known as “search engine optimisation” or SEO.
[…]
“What Google has done is they’ve pulled bits out of the text based on what people are searching for and fed them what they want to read” – Sarah Presch
Beware online “filter bubbles” (2011) - Eli Pariser
ted.com/…/eli_pariser_beware_online_filter_bubble…
Yeah, that’s not new, but I feel there are still many who are unaware, although I don’t understand why.
It’s important to post these things every so often. There will never be a day when everyone already knows. :)
xkcd.com/1053/
Fuck Google
Yes, but this issue is not one we should want Google solving. We need better media literacy education throughout life.
I don’t disagree with you. I’m saying Google’s algorithm is part of the cause not the cure.
On the one hand, Google sucks. On the other hand, if people are unable to a) understand how those two snippets are not contradictory, and b) read at least one very short simplified-for-laymen Mayo Clinic article about the topic before thinking they've learned anything at all about medicine, it's hard to see the problem as being primarily due to Google. There is something deeper, and worse, going wrong when people habitually take that kind of extreme shortcut to thinking that they know the right answer about almost anything, and it has little to do with whether any one-sentence snippets they're given are biased or accurate.
…and any good search engine will find results containing keywords such as “Kamala Harris”, “Democratic”, “candidate”, and “good”.
In that case, of course the search engine will find results containing keywords such as “Kamala Harris”, “Democratic”, “candidate”, and “bad”.
So the whole premise that, “Fundamentally, that’s an identical question” is just bullshit when it comes to searching. Obviously, when you put in the keyword “good”, you’ll find articles containing “good”, and if you put in the keyword “bad”, you’ll find articles containing “bad” instead.
Google will find things that match the keywords that you put in. So does DuckDuckGo, Qwant, Yahoo, whatever. That is what a good search engine is supposed to do.
I can assure you, when search engines stop doing that, and instead try to give “balanced” results, according to whatever opaque criteria for “balanced” their company comes up with, that will be the real problem.
I don’t like Google, and only use Google when other search engines fail. But this article is BS.
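The commenter’s point about keyword matching can be sketched in a few lines. This is a deliberately naive toy, not how any real search engine ranks results: a hypothetical “index” of two headlines, scored by how many query terms each contains. Swapping “good” for “bad” in the query flips which document scores highest, exactly as the comment describes.

```python
# Toy keyword-matching "search engine" (hypothetical data, not a real ranker).
def score(doc: str, query: str) -> int:
    # Count how many query terms appear in the document.
    doc_words = set(doc.lower().split())
    return sum(1 for term in query.lower().split() if term in doc_words)

docs = [
    "Why Kamala Harris is a good Democratic candidate",
    "Why Kamala Harris is a bad Democratic candidate",
]

def search(query: str, docs: list[str]) -> str:
    # Return the document with the most query-term overlap.
    return max(docs, key=lambda d: score(d, query))

print(search("is kamala harris a good candidate", docs))  # the "good" headline wins
print(search("is kamala harris a bad candidate", docs))   # the "bad" headline wins
```

The two queries differ by a single keyword, and a term-overlap ranker has no choice but to surface the document containing that keyword, which is the whole of the commenter’s objection to calling them “an identical question”.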
Ah, but beyond the search results there’s also a big AI summary at the top, which I’m more concerned about.
Yeah. Be very, very afraid of people using search engines or “AI” as some Magic Eightball oracle to give them answers.