Vivaldi explains why they will not embed LLM functionality in their browser (vivaldi.com)
from dantheclamman@lemmy.world to technology@lemmy.world on 05 Feb 2024 17:32
https://lemmy.world/post/11596827

#technology


Fudoshin@feddit.uk on 05 Feb 2024 17:38 next collapse

Didn’t even think of it as a possibility. WTF would a browser need with an LLM?

dantheclamman@lemmy.world on 05 Feb 2024 17:45 next collapse

Edge is branding itself “The AI Browser”. Chrome has plans to embed LLMs for text input. Opera, the browser which was commandeered from the original Vivaldi team and turned into a crypto/VPN gimmick browser, is of course among those leaning hardest into the LLM trend.

Fudoshin@feddit.uk on 05 Feb 2024 17:49 collapse

Cheers DANNY BOOOOOOOYYY!

Even_Adder@lemmy.dbzer0.com on 05 Feb 2024 18:19 next collapse

Local translation of text comes to mind.

Fudoshin@feddit.uk on 05 Feb 2024 18:33 next collapse

I was thinking more along the lines of communicating with a Klingon captain on a D7 Battlecruiser.

AlmightySnoo@lemmy.world on 05 Feb 2024 19:27 next collapse

Yup, Firefox has it: browser.mt (it’s now a native part of Firefox)

youngGoku@lemmy.world on 05 Feb 2024 19:42 collapse

Hmm, maybe this is why Firefox is so damn slow on my Raspberry Pi.

AlmightySnoo@lemmy.world on 05 Feb 2024 20:26 collapse

Hmm, I don’t think it’s because of that feature, because it only runs when you explicitly ask it to translate a page for you. You should probably check your extensions and see if you have some redundant ones (a mistake people make is using multiple ad-blockers/anti-trackers, when just uBlock Origin + Firefox’s defaults are usually good enough).

Ephera@lemmy.ml on 05 Feb 2024 19:30 next collapse

Firefox has that already (without using an LLM). But yeah, it’s still another way this could be implemented or possibly improved.

Shurimal@kbin.social on 06 Feb 2024 13:53 collapse

Vivaldi has had local translation for about half a year now. No need for an LLM for this feature.

Ephera@lemmy.ml on 05 Feb 2024 19:33 next collapse

Webpage authors use LLMs to generate extremely long articles, to make you scroll by ads for longer. You use LLMs in your browser to summarize those articles. The circle of life, or something.

Steve@communick.news on 05 Feb 2024 22:19 next collapse

I do this already. It’s great. Kagi has a browser plugin that does it.

jaybone@lemmy.world on 06 Feb 2024 01:11 next collapse

The circle of death.

akrot@lemmy.world on 06 Feb 2024 08:34 collapse

I stumbled upon a website through DDG, and after a long intro, the main section, supposedly where the thing I was searching for would be, just said “Sorry, I can’t fulfill your request right now”. Basically a fully generated page built to match my search with some parasitic SEO tactics. The web be changing. Front page of DDG.

Hamartiogonic@sopuli.xyz on 06 Feb 2024 12:59 collapse

Reminds me of the Amazon products titled “I’m sorry but I cannot fulfill this request it goes against OpenAI use policy”.

I think we’re way beyond the point of no return. The internet has been ruined for good.

Byter@lemmy.one on 05 Feb 2024 19:54 next collapse

I’d love a browser-embedded LLM that had access to the DOM.

“Highlight all passages that talk about yadda yadda. Remove all other content. Convert the dates to the ISO standard. Put them on a number line chart, labeled by blah.”

That’d be great UX.
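
Something like this rough content-script sketch is what I’m imagining. Everything here is hypothetical: askLLM is a stand-in for whatever model the browser (or an extension) would expose, not a real API.

```typescript
// A rough sketch, not a real browser API: askLLM is a hypothetical helper that
// would forward a prompt to whatever model the browser or an extension exposes.
async function askLLM(prompt: string): Promise<string> {
  // Placeholder: wire this up to an actual local or remote model.
  throw new Error("askLLM is a stub; connect it to an LLM backend of your choice");
}

// Ask the model which paragraphs talk about a topic, highlight them,
// and hide everything else on the page.
async function highlightPassages(topic: string): Promise<void> {
  const paragraphs = Array.from(document.querySelectorAll<HTMLParagraphElement>("p"));

  // Number each paragraph so the model can refer back to it unambiguously.
  const numbered = paragraphs
    .map((p, i) => `[${i}] ${p.innerText.slice(0, 500)}`)
    .join("\n");

  const answer = await askLLM(
    `Reply with a comma-separated list of the numbers of the paragraphs that talk about "${topic}".\n\n${numbered}`
  );

  const keep = new Set(answer.match(/\d+/g)?.map(Number) ?? []);

  paragraphs.forEach((p, i) => {
    if (keep.has(i)) {
      p.style.backgroundColor = "yellow"; // highlight the matches
    } else {
      p.style.display = "none"; // remove all other content
    }
  });
}
```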

Cqrd@lemmy.dbzer0.com on 06 Feb 2024 00:48 next collapse

Arc has an LLM feature that lets you replace your in-page search with “search or ask”: if you type a question, it tries to answer it based on the content of the page. Kinda close to what you’re talking about.

Arc is genuinely trying to use LLMs in their browser in interesting ways.

JoeyJoeJoeJr@lemmy.ml on 06 Feb 2024 01:03 next collapse

You are falling into a common trap. LLMs do not have understanding - asking it to do things like convert dates and put them on a number line may yield correct results sometimes, but since the LLM does not understand what it’s doing, it may “hallucinate” dates that look correct, but don’t actually align with the source.

Byter@lemmy.one on 06 Feb 2024 14:10 collapse

Thank you for calling that out. I’m well aware, but appreciate your cautioning.

I’ve seen hallucinations from LLMs at home and at work (where I’ve literally had them transcribe dates like this). They’re still absolutely worth it for their ability to handle unstructured data and the speed of iteration you get – whether they “understand” the task or not.

I know to check my (its) work when it matters, and I can add guard rails and selectively make parts of the process more robust later if need be.
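
For the curious, the kind of guard rail I mean is cheap to sketch: have the model return a verbatim quote next to each converted date, then drop anything whose quote can’t be found in the source. The Extracted shape below is just my own hypothetical format, not anything standard.

```typescript
// Sketch of a guard rail for LLM date extraction: each extracted item must carry a
// verbatim quote from the source, and anything whose quote can't be found is dropped.
// The `Extracted` shape is a hypothetical format for illustration, not a standard one.
interface Extracted {
  isoDate: string; // e.g. "2024-02-05", as converted by the model
  quote: string;   // the exact source snippet the model claims the date came from
}

function verifyExtractions(sourceText: string, items: Extracted[]): Extracted[] {
  const normalizedSource = sourceText.replace(/\s+/g, " ");
  return items.filter((item) => {
    const quoteFound = normalizedSource.includes(item.quote.replace(/\s+/g, " "));
    const looksLikeIso = /^\d{4}-\d{2}-\d{2}$/.test(item.isoDate);
    if (!quoteFound || !looksLikeIso) {
      console.warn("Dropping possibly hallucinated item:", item);
      return false;
    }
    return true;
  });
}
```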

daed@lemmy.world on 06 Feb 2024 01:08 collapse

That’s actually fascinating to think about. Would be a fun project to mash something like Blazor Server and an LLM together and allow users to just kindly ask to rewrite the DOM in plain English.

LibreFish@lemmy.world on 06 Feb 2024 01:18 collapse

Quick tool to summarize a page, proofread, or compare it to another source. Still needs a functioning human brain to separate the wheat from the chaff, so to speak, but I could see an LLM (especially a local one) being useful in some ways.

I’m sure there are disabilities or unique use cases that could increase its usefulness, especially once the models improve more.
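
To illustrate the local angle, here’s a rough sketch of summarizing a page against a locally running model. It assumes an Ollama server on its default port; the model name and prompt are arbitrary, and the page’s origin would likely need to be allowed in Ollama’s CORS settings (OLLAMA_ORIGINS).

```typescript
// Sketch: summarize the visible text of the current page with a locally running
// model, assuming an Ollama server on its default port (http://localhost:11434).
// Model name and prompt are arbitrary choices for illustration; the page's origin
// may need to be allowed via Ollama's OLLAMA_ORIGINS setting for this to work.
async function summarizePage(): Promise<string> {
  const pageText = document.body.innerText.slice(0, 8000); // keep the prompt small

  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      prompt: `Summarize the following page in five bullet points:\n\n${pageText}`,
      stream: false,
    }),
  });

  const data = await res.json();
  return data.response; // Ollama puts the generated text in the `response` field
}
```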

wolfruff@pawb.social on 05 Feb 2024 17:48 next collapse

Yet another reason why I use Vivaldi over every other Chromium fork.

sab@kbin.social on 05 Feb 2024 20:09 next collapse

Same. Which is to say I have it installed and boot it along with GNOME Web every time I need to check that my shitty web programming works outside of Gecko. Which is thankfully rare.

Vivaldi is nice though.

sebinspace@lemmy.world on 05 Feb 2024 22:30 collapse

Still a Chromium fork.

SpeechToTextCloud@discuss.tchncs.de on 05 Feb 2024 19:19 next collapse

Kind of ironic how they use an AI-generated article image.

soulfirethewolf@lemdro.id on 05 Feb 2024 21:04 collapse

Ummm… No?

depositphotos.com/…/thinking-hominoid-robot-analy…

Upload Date: Oct 9, 2022

dantheclamman@lemmy.world on 05 Feb 2024 21:08 collapse

Another cue is that the numbers aren’t gobbledygook.

DingoBilly@lemmy.world on 05 Feb 2024 21:38 next collapse

Basically, there’s no commercial benefit at this stage.

Once there is a benefit, they will add it in.

Whole lot of article for nothing.

dantheclamman@lemmy.world on 05 Feb 2024 21:54 collapse

That’s not what they said at all.

DingoBilly@lemmy.world on 06 Feb 2024 00:08 collapse

Yeah they did, but with some nice words around it all.

Did you read the article? It’s literally that LLMs are not at a stage where they’re useful. And then they end by saying that once they become useful, they’ll look at adding them in.

And Vivaldi is a business like any other, so it’s ultimately commercially driven. If you think they’re just running the business for the sake of being good then you’re incredibly naive.

dantheclamman@lemmy.world on 06 Feb 2024 01:16 collapse

Blocked for poor reading comprehension

DingoBilly@lemmy.world on 06 Feb 2024 02:11 collapse

Lol. People are way too tribalistic about their favourite products.

I’m not even saying anything negative - adding AI to your product is not a big deal, and they’ll do it when it makes sense to. Not sure why you have an issue with that, or don’t understand that it’s their intent, since it’s very clearly stated by them.

Literally, here you go: "Despite all this, we feel that the field of machine learning in general remains an exciting one and may lead to features that are actually useful. In the future, we hope that it will allow us to bring good privacy-respecting features to our users with a focus on improving discoverability and accessibility."

sebinspace@lemmy.world on 05 Feb 2024 22:29 collapse

LLMs are essentially confident-sounding lying machines with a penchant to occasionally disclose private data or plagiarise existing work. While they do this, they also use vast amounts of energy

Just described most people

SkyNTP@lemmy.ml on 05 Feb 2024 23:33 collapse

Coincidentally, also why I don’t care much for most social media content.

bamboo@lemm.ee on 06 Feb 2024 06:38 collapse

And yet here you are

Draupnir@lemmy.world on 06 Feb 2024 06:57 next collapse

Well, I don’t know about you, but my mind goes to user-written Instagram posts, Facebook posts, and tweets. You know, things like local moms groups circlejerking about toxins in foods, etc., etc.

GiveMemes@jlai.lu on 06 Feb 2024 13:35 collapse

I think that’s just a different sort of person. While there is some relatively high-level discussion on Lemmy, we also like to circlejerk, just about other things. Different strokes for different folks. The two services aren’t that different, besides Lemmy being FOSS.

CoggyMcFee@lemmy.world on 06 Feb 2024 14:03 collapse

They did say “most”, so I’m not sure what you’re trying to call out here.