Alpaca Flatpak app
from MickeyMice@lemm.ee to linux@lemmy.ml on 28 Jun 21:12
https://lemm.ee/post/68103812

I'm trying to figure out how the online search function works… I haven't had much luck so far. A general discussion about the app would also be very helpful for everyone.

#linux

threaded - newest

SnotFlickerman@lemmy.blahaj.zone on 28 Jun 22:48 next collapse

github.com/Jeffser/Alpaca

This will probably help anyone unfamiliar with it, since the first search result for "Alpaca AI" is another online paid AI service that does something entirely different: AI image generation.

The main question I have: since Ollama is optional… if you do choose to use it, is it still sharing data with Meta?

MickeyMice@lemm.ee on 28 Jun 22:51 collapse

I didn't know that Ollama shares data with Facebook… Why would it do something like that? Wouldn't that be the opposite of what it was created for, namely privacy? Where did you get that info?

SnotFlickerman@lemmy.blahaj.zone on 28 Jun 22:58 collapse

From the comments, it looked like that's why he made the Ollama integration optional: some people were concerned since Ollama was built by Meta. It can run without Ollama, it seems.

EDIT: Doing more research on Ollama itself, I’m unconvinced that it’s sharing any data, despite being built by Meta.

MickeyMice@lemm.ee on 28 Jun 23:31 collapse

I didn't know that Ollama was built by Meta; where did you find that out? It's also an open-source project, so it shouldn't have malicious code like that…

spencer@lemmy.ca on 29 Jun 01:01 collapse

Meta trained and published the model, but it's an open model. I'm not an expert, but I don't believe it's sharing data with Meta, since it's just the model they trained: you can download it and run it offline. You're just using the output of all the training they did, on your own compute.
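As a concrete sketch of that last point (assuming Ollama is installed; `llama3` here stands in for whichever Llama release you want): the weights are fetched once over the network, after which inference runs entirely on your own machine.

```shell
# Fetch Meta's open-weight Llama model once (network is needed only here)
ollama pull llama3

# From then on, prompts are answered locally on your own hardware
ollama run llama3 "Why is the sky blue?"
```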

MickeyMice@lemm.ee on 29 Jun 06:03 next collapse

So it doesn't have anything to do with the Ollama software; you can download any LLM, it doesn't have to be Meta's…

ctrl_alt_esc@lemmy.ml on 29 Jun 09:53 collapse

You're talking about the Llama models, not Ollama.

astro_ray@piefed.social on 29 Jun 01:08 next collapse

There are still active accounts on lemm.ee?

I am not certain what you mean by online search function. It can connect to the internet but it doesn't exactly function like a search engine from what I can understand.

MickeyMice@lemm.ee on 29 Jun 06:11 collapse

<img alt="" src="https://lemm.ee/pictrs/image/160b1c5c-81eb-43a8-8f25-a60d2b2cb496.png">

The second option looks like exactly that…

vermaterc@lemmy.ml on 29 Jun 07:12 next collapse

Taking advantage of the fact that this thread became popular, a question to all of you: can you recommend any other open-source LLM front ends?

RmDebArc_5@sh.itjust.works on 29 Jun 11:47 next collapse

I've had good experiences with GPT4ALL

juipeltje@lemmy.world on 29 Jun 13:04 next collapse

So far I've really liked just using Ollama in the terminal, since it just spits out text anyway.

vermaterc@lemmy.ml on 29 Jun 13:08 collapse

Of course I could even send raw API requests, but sometimes it's good to have a nice GUI that "just works".

Specifically I’m looking for something that could handle not only text responses, but also attachments, speech recognition and MCP support.
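For reference, a raw request to Ollama's local HTTP API can be sketched like this (a minimal standard-library-only example; it assumes Ollama is listening on its default port 11434 and that a model named `llama3` has been pulled):

```python
import json

# Build the request body for Ollama's /api/generate endpoint.
# "stream": False asks for one complete JSON reply instead of a token stream.
payload = {
    "model": "llama3",          # assumed model name; use whatever you pulled
    "prompt": "Why is the sky blue?",
    "stream": False,
}
body = json.dumps(payload)

# Uncomment to actually send it (requires a running Ollama instance):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```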

juipeltje@lemmy.world on 29 Jun 16:18 collapse

Yeah, in that case you probably want something else. So far I've only ever used it for text-based questions. I think I remember seeing that there is also a web UI out there, but I don't remember the name.

domi@lemmy.secnd.me on 29 Jun 14:39 next collapse

LM Studio is by far my favorite. Supports all GPUs out of the box on Linux and has tons of options.

vala@lemmy.world on 29 Jun 16:39 collapse

LM Studio is not open source at all.

domi@lemmy.secnd.me on 29 Jun 17:43 collapse

Looks like you’re right.

I switched to it when Alpaca stopped working on AMD GPUs and was under the impression it is open source.

ozymandias117@lemmy.world on 30 Jun 02:29 next collapse

Depending on how you had it installed: Alpaca split GPU support across separate Flatpak packages.

If you want AMD support, you need to install com.jeffser.Alpaca.Plugins.AMD
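Assuming the plugin is published on Flathub under that ID (as the base app is), the install would look something like:

```shell
# Install the AMD plugin alongside the base Alpaca Flatpak;
# "flathub" is the usual remote name, adjust if yours differs
flatpak install flathub com.jeffser.Alpaca.Plugins.AMD
```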

domi@lemmy.secnd.me on 30 Jun 07:36 collapse

Doesn’t work for me unfortunately, always falls back to CPU ever since the packages were split up.

sudo_halt@lemmygrad.ml on 30 Jun 08:29 collapse

The engine is open source, the UI isn't. It's an easy mistake to make.

isVeryLoud@lemmy.ca on 29 Jun 20:13 next collapse

I was using LibreChat for a while

Teppichbrand@feddit.org on 29 Jun 07:27 next collapse

This is off topic, but I switched from Alpaca to duck.ai. I try not to use AI too often, and even though I like the idea of running it locally, duck.ai is way easier to use.

MickeyMice@lemm.ee on 29 Jun 10:34 collapse

I use it sometimes too, but that's an online AI: everything you use it for goes to someone's servers, so it's not private. Alpaca uses Ollama to run the AI locally on your machine, so everything you use it for stays private. Those two are completely different, and comparing which one is easier to use misses the point: ease of use isn't the point here, privacy is.

Shape4985@lemmy.ml on 29 Jun 07:29 collapse

I used Alpaca, but they made some changes recently that made it confusing and a pain to use. I deleted it after that, as I don't use AI much anyway.