Wikipedia is gauging interest for an extension that uses AI to see if any claim is cited on Wikipedia (meta.wikimedia.org)
from Aatube@kbin.melroy.org to technology@lemmy.world on 09 Apr 2024 00:05
https://kbin.melroy.org/m/technology@lemmy.world/t/199027

A prototype is available, though it's Chrome-only and English-only at the moment. The way it works is that you select some text and then click the extension, which will try to "return the relevant quote and inference for the user, along with links to article and quality signals".

Under the hood, it uses ChatGPT to generate a search query, runs that query against Wikipedia's search API to find relevant article text, and then uses ChatGPT again to extract the relevant part.
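
For the curious, here's a minimal sketch of what that three-step loop might look like. This is purely my own illustration, not the extension's actual code: the prompts, function names, and gpt-3.5-turbo model choice are assumptions; only the Wikipedia search API and OpenAI chat completions endpoints are real.

```typescript
// Illustrative sketch only -- not the extension's real implementation.
const WIKI_API = "https://en.wikipedia.org/w/api.php";
const OPENAI_API = "https://api.openai.com/v1/chat/completions";

async function askChatGPT(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(OPENAI_API, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // assumption: any chat model would do
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim();
}

async function checkClaim(selectedText: string, apiKey: string) {
  // 1. Ask the model to turn the highlighted claim into a search query.
  const query = await askChatGPT(
    `Write a short Wikipedia search query for this claim:\n${selectedText}`,
    apiKey,
  );

  // 2. Run that query against Wikipedia's search API.
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srsearch: query,
    format: "json",
    origin: "*", // allow cross-origin requests from the browser
  });
  const searchRes = await fetch(`${WIKI_API}?${params}`);
  const searchData = await searchRes.json();
  const hits = searchData.query.search as { title: string; snippet: string }[];
  if (hits.length === 0) return { verdict: "no relevant article found" };

  // 3. Ask the model to pull out the passage (if any) that bears on the claim.
  const top = hits[0];
  const quote = await askChatGPT(
    `Claim: ${selectedText}\nArticle snippet: ${top.snippet}\n` +
      `Quote the part that supports or contradicts the claim, or say "none".`,
    apiKey,
  );

  return {
    article: `https://en.wikipedia.org/wiki/${encodeURIComponent(top.title)}`,
    quote,
  };
}
```

The "quality signals" part would presumably come from article metadata layered on top; the core loop is just those three calls.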

#ai #browser #extension #technology #wikipedia


dukethorion@lemmy.one on 09 Apr 2024 00:34 next collapse

How would this be different from any browser that has Wikipedia search built in?

OsrsNeedsF2P@lemmy.ml on 09 Apr 2024 00:59 next collapse

AI!

Aatube@kbin.melroy.org on 09 Apr 2024 01:00 next collapse

It could run the search in the background

sbv@sh.itjust.works on 09 Apr 2024 01:45 next collapse

Presumably it would evaluate claims in the text without the user having to do the search. Sounds cool to me.

dukethorion@lemmy.world on 09 Apr 2024 02:24 collapse

It says the user has to highlight then click the extension.

I can currently right-click and then click “Search on Wikipedia” in the context menu. I believe this works in both FF and Chromium browsers.

Fuck AI.

Aatube@kbin.melroy.org on 09 Apr 2024 02:29 next collapse

I prefer a floating button next to the text over having to set my default search engine to Wikipedia or download an addon that only adds it to the context menu, lol. I need to complete my unholy trinity of levitating context buttons.

PlantJam@lemmy.world on 09 Apr 2024 12:51 collapse

And here I am going out of my way to disable any and all floating buttons.

FooBarrington@lemmy.world on 09 Apr 2024 10:08 collapse

I tried with your comment: en.wikipedia.org/wiki/Special:Search?go=Go&ns0=1&….

Why doesn’t this work? If your complaint were valid, this should work.

Aatube@kbin.melroy.org on 09 Apr 2024 13:10 collapse

Not sure what's not working here.

FooBarrington@lemmy.world on 09 Apr 2024 13:36 collapse

Where do you see it working? I see the result:

There were no results matching the query.

The extension mentioned in the post is supposed to:

return the relevant quote and inference for the user, along with links to article and quality signals

I don’t see any relevant quotes, or links to articles, or quality signals.

Aatube@kbin.melroy.org on 09 Apr 2024 15:00 collapse

Yeah, because there is no relevant Wikipedia information about what you searched for.

FooBarrington@lemmy.world on 09 Apr 2024 15:14 collapse

You are 100% sure that there is no mention of Wikipedia being integrated into Firefox/Chrome in any page? How thoroughly have you checked?

Aatube@kbin.melroy.org on 09 Apr 2024 15:44 collapse

Due to Russell's Teapot, I cannot be thoroughly sure of that, at least in article space. However, that does not mean your claim stands, unless you find a mention.

In principle, the media has no reason to cover this, so no mention should exist.

afraid_of_zombies@lemmy.world on 10 Apr 2024 01:30 next collapse

It doesn’t have a friendly looking duck as a logo.

[deleted] on 26 Sep 2024 13:44 collapse

.

db2@lemmy.world on 09 Apr 2024 00:55 next collapse

👎

AlternateRoute@lemmy.ca on 09 Apr 2024 00:56 next collapse

Sounds like Wikipedia search, but with slower, more expensive steps.

vhstape@lemmy.sdf.org on 09 Apr 2024 02:22 next collapse

Is it that hard to fact-check things?? Not to mention, a quick web search uses much less power/resources compared to AI inference…

Aatube@kbin.melroy.org on 09 Apr 2024 02:30 next collapse

Well, the hard truth is that AI's convenient and sells

swordsmanluke@programming.dev on 09 Apr 2024 15:53 collapse

a quick web search uses much less power/resources compared to AI inference

Do you have a source for that? Not that I’m doubting you, just curious. I read once that the internet infrastructure required to support a cellphone uses about the same amount of electricity as an average US home.

Thinking about it, I know that LeGoog has yuge data centers to support its search engine. A simple web search is going to hit their massive distributed DB to return answers in subsecond time. Whereas an LLM query (NOT training, which is admittedly cuckoo bananas energy intensive) would be executed on a single GPU, albeit a hefty one.

So on one hand you'll have a query hitting multiple (comparatively) lightweight machines to look up results, plus all the networking gear in between. On the other, a beefy single-GPU machine.

(All of this is from the perspective of handling a single request, of course. I’m not suggesting that Wikipedia would run this service on only one machine.)

barsoap@lemm.ee on 09 Apr 2024 18:14 next collapse

A simple web search is going to hit their massive distributed DB to return answers in subsecond time.

It's going to hit an index, not the actual data, and it's going to return approximate rather than exact results. Tons of engineering has been done around basic search precisely to get more data locality.

I read a blog post at some point (please don't ask me where) about Bing vs. Google when Bing started to use ChatGPT, and it basically boiled down to "Google has the tech to do it; they don't roll it out because they don't want to eat the electricity bill. This is MS spending money to get market share." The cost difference between providing search and having ChatGPT answer a question was something like 10x. It might not be that way forever, though, what with beating models down to work in trinary and stuff. That's not just massive quantisation but also much easier maths: convolutions don't need much maths when all you deal with is -1, 0, 1. IIRC you can throw out the multiplication unit and work with nothing but shifts and adds.
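
To make the -1/0/1 point concrete, here's a toy sketch (mine, not from any real ternary inference kernel): a dot product with ternary weights needs no multiplier at all, just adds, subtracts, and skips.

```typescript
// Toy illustration of the -1/0/1 idea: with ternary weights, "multiplying"
// each activation reduces to adding it, subtracting it, or skipping it.
function ternaryDot(weights: Int8Array, activations: Float32Array): number {
  let acc = 0;
  for (let i = 0; i < weights.length; i++) {
    if (weights[i] === 1) acc += activations[i];        // +1: add
    else if (weights[i] === -1) acc -= activations[i];  // -1: subtract
    // 0: skip entirely
  }
  return acc;
}

// Example: weights [1, 0, -1] applied to [0.5, 2.0, 1.5] -> 0.5 - 1.5 = -1.0
console.log(ternaryDot(new Int8Array([1, 0, -1]), new Float32Array([0.5, 2.0, 1.5])));
```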

sheogorath@lemmy.world on 09 Apr 2024 19:49 collapse

Based on this article, it seems that on average an LLM query costs about 10x as much as a search engine query.

swordsmanluke@programming.dev on 10 Apr 2024 02:11 collapse

Man, that's wild. Thank you for coming through with a citation, I appreciate it!

maxenmajs@lemmy.world on 09 Apr 2024 06:57 next collapse

Seems like a genuine attempt to use AI for good. I’m interested.

ViscloReader@lemmy.world on 09 Apr 2024 08:45 next collapse

While I see this as one of the rare nice uses of AI, if the goal is just to fact-check some text found on the web, it could also just fetch it from the site instead of using an AI.

Might be overkill to use LLMs here, I think…

Aatube@kbin.melroy.org on 09 Apr 2024 10:59 collapse

The problem arises when a site uses different wording. Wikipedia's search engine isn't that good, so that problem could make the extension fail enough times to stunt retention.

sugar_in_your_tea@sh.itjust.works on 10 Apr 2024 07:32 collapse

Wikipedia’s search engine isn’t that good

That’s a pretty big understatement. It’s pretty awful.

MSids@lemmy.world on 09 Apr 2024 10:23 next collapse

Available to all Wikipedia+ subscribers

BlueBockser@programming.dev on 09 Apr 2024 15:30 next collapse

I'm skeptical given how confidently many recent AI models make wrong claims. Fact-checking seems to be a rather poor use case for current AI models IMO.

swordsmanluke@programming.dev on 09 Apr 2024 15:41 collapse

This looks less like the LLM making claims of its own and more like using an LLM to generate a search query and then read through the results to find anything that might relate to the section being checked.

It leans into the things LLMs are pretty good at (summarizing natural language; constructing queries according to a given pattern; checking through text for content that matches semantically instead of literally) and links directly to a source instead of leaning on the thing that LLMs only pretend to be good at (synthesizing answers).

quicksand@lemm.ee on 09 Apr 2024 16:39 collapse

AI is going to start writing entire fake research papers and books written by fake authors, just so it can be cited as a source for a high school kid using it to cheat on a 500 word essay.

afraid_of_zombies@lemmy.world on 10 Apr 2024 01:28 collapse

We have become like really shitty gods. All this power to do freakin' nothing. At least the Greek gods made lightning and thunder.