Want a more private ChatGPT alternative that runs offline? Check out Jan (bgr.com)
from throws_lemy@lemmy.nz to technology@lemmy.world on 20 Jan 2024 15:18
https://lemmy.nz/post/5777229

#technology


PerogiBoi@lemmy.ca on 20 Jan 2024 16:33 next collapse

Also check out LM Studio and GPT4All. Both let you run private ChatGPT alternatives from Hugging Face off your RAM and processor (they can also offload to the GPU).
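To make the RAM/GPU-offload trade-off concrete, here's a rough back-of-the-envelope sketch (not any tool's actual logic — the layer counts and sizes are illustrative) of how many transformer layers of a quantized model fit in a given amount of VRAM, with the rest spilling to system RAM:

```python
# Back-of-the-envelope: how much of a quantized model fits in VRAM.
# Assumes memory is spread evenly across layers, which is only roughly true.

def split_layers(n_layers: int, model_gb: float, vram_gb: float) -> int:
    """Estimate how many layers can be offloaded to the GPU."""
    gb_per_layer = model_gb / n_layers
    return min(n_layers, int(vram_gb // gb_per_layer))

# E.g. a ~8 GB quantized model with 32 layers on a 6 GB card:
print(split_layers(n_layers=32, model_gb=8.0, vram_gb=6.0))  # → 24
# A ~4 GB 7B model fits entirely:
print(split_layers(n_layers=32, model_gb=4.0, vram_gb=6.0))  # → 32
```

Tools that support offloading typically expose this as a "GPU layers" slider or flag; anything that doesn't fit runs on the CPU, which is why partial offload is slower.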

tubbadu@lemmy.kde.social on 20 Jan 2024 16:36 next collapse

Are they as good as chatgpt?

PerogiBoi@lemmy.ca on 20 Jan 2024 16:38 next collapse

Mistral is thought to be almost as good. I’ve used the latest version of Mistral and found it more or less identical in quality of output.

It’s not as fast though, as I’m running it off of 16GB of RAM and an old GTX 1060 card.

If you use LM Studio, I’d say it’s actually better, because you can give it a pre-prompt so that all of its answers stay within predefined guardrails (ex: you are Glorb the cheese pirate and you have a passion for mink fur coats).

There’s also the benefit of being able to load in uncensored models if you would like questionable content created (erotica, sketchy instructions on how to synthesize crystal meth, etc).
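The pre-prompt (system prompt) idea above can be sketched as an OpenAI-style chat payload — LM Studio can serve local models behind an OpenAI-compatible HTTP API, though the endpoint and port in the comment below are illustrative defaults, not guaranteed:

```python
# Sketch: a "pre-prompt" is just a system message that precedes every exchange.

def build_payload(system_prompt: str, user_msg: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "messages": [
            {"role": "system", "content": system_prompt},  # the guardrail/persona
            {"role": "user", "content": user_msg},
        ],
        "temperature": 0.7,
    }

payload = build_payload(
    "You are Glorb the cheese pirate with a passion for mink fur coats.",
    "Introduce yourself.",
)
# To actually use it, POST this as JSON to the local server,
# e.g. http://localhost:1234/v1/chat/completions (port depends on your setup).
```

Every reply then stays "in character" because the model sees the system message before each user turn.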

tsonfeir@lemm.ee on 20 Jan 2024 16:44 next collapse

I’m sure that meth is for personal use right? Right?

PerogiBoi@lemmy.ca on 21 Jan 2024 01:49 collapse

Absolutely. Synthesizing hard drugs is time consuming and a lot of hard work. Only I get to enjoy it.

tsonfeir@lemm.ee on 21 Jan 2024 02:22 collapse

No one gets my mushrooms either ;)

PerogiBoi@lemmy.ca on 21 Jan 2024 17:36 next collapse

brf tek says hi

tsonfeir@lemm.ee on 21 Jan 2024 17:49 next collapse

I just buy my substrate online. I’m far less experimental than most. I just want it to work in a consistent way that yields an amount I can predict.

What I really want to grow is Peyote or San Pedro, but the slow growth and lack of sun in my location would make that difficult.

PerogiBoi@lemmy.ca on 21 Jan 2024 23:53 collapse

Precolonized or just substrate? I wonder if a grow lamp would work. Pretty pricey though for electricity.

tsonfeir@lemm.ee on 22 Jan 2024 00:07 collapse

Just the substrate. I use the syringes I buy online because it’s the easiest method for me. I’m not actually sure pre-colonized would be “legal” for my “microscope” hobby. Ahem.

Mostly it’s just B+, which I enjoy and it’s simple to grow. Golden Teacher was also easy, but the yield was less.

PerogiBoi@lemmy.ca on 22 Jan 2024 00:17 collapse

I had such low yield with B+ or any other strains 🤪probs just my technique.

tsonfeir@lemm.ee on 22 Jan 2024 00:40 collapse

Well, low might be relative to your expectations. How many grams did you want vs what you got?

I see some people on Reddit with huge-ass yields. Like, fist-sized caps. wtf. Ultimately, I got the amount I wanted to use out of it.

PerogiBoi@lemmy.ca on 22 Jan 2024 15:08 collapse

I saw roughly 11 grams dried. Was expecting 3x the amount judging by everyone’s impressive grows haha.

tsonfeir@lemm.ee on 22 Jan 2024 18:03 collapse

Wow that is low. I got about 60g dry.

Did you have a pump keeping it moist?

PerogiBoi@lemmy.ca on 22 Jan 2024 21:13 collapse

No but I misted it and fanned it as was the style back then.

tsonfeir@lemm.ee on 22 Jan 2024 21:18 collapse

The good ole onion belt.

That’s probably why. I had a whole humidified thing and timed lights. It wasn’t expensive, and it came in a kit. Give it another shot.

Midwestgrowkits.com

PerogiBoi@lemmy.ca on 22 Jan 2024 21:31 collapse

Ooo pre-pasteurized manure bags with self healing injection holes. I’ve been using brown rice flour and coco coir 🤪

Rai@lemmy.dbzer0.com on 22 Jan 2024 00:31 collapse

Whazzat? I’ve only met uncle Ben!

PerogiBoi@lemmy.ca on 22 Jan 2024 15:07 collapse

It’s the great-grandfather of Uncle Ben

Chee_Koala@lemmy.world on 22 Jan 2024 16:07 collapse

Even though growing mushrooms is almost the easiest thing to do on the planet

tsonfeir@lemm.ee on 22 Jan 2024 18:04 collapse

Exactly, grow your own.

timetravel@lemmings.world on 21 Jan 2024 06:00 collapse

Can you provide links for those? I see a few and don’t trust search results

Black_Gulaman@lemmy.dbzer0.com on 21 Jan 2024 06:39 next collapse

You can search inside LM Studio for uncensored or roleplay models. Select the size you want and it’s all good from there.

PerogiBoi@lemmy.ca on 21 Jan 2024 17:36 collapse

They’re the first results on all major search engines.

tsonfeir@lemm.ee on 20 Jan 2024 16:43 next collapse

No.

Hestia@lemmy.world on 22 Jan 2024 22:49 collapse

Depends on your use case. If you want uncensored output then running locally is about the only game in town.

Just_Pizza_Crust@lemmy.world on 20 Jan 2024 17:02 next collapse

I’d also recommend Oobabooga if you’re already familiar with Automatic1111 for Stable Diffusion. I’ve found that being able to write the first part of the bot’s response gets much better results and seems to make it fabricate false info much less.
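The "write the first part of the bot's response" trick works because completion-style models simply continue whatever text they're given. A minimal sketch, using an Alpaca-style template purely as an illustration (the exact template depends on the model):

```python
# Sketch: seed the assistant's reply so the model continues from your words
# instead of starting fresh (often called "response prefill").

def build_prompt(instruction: str, response_start: str) -> str:
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
        f"{response_start}"  # the model picks up from here
    )

prompt = build_prompt(
    "List three known uses for coco coir.",
    "Here are three documented uses, with no speculation:\n1.",
)
```

Starting the answer with a committed framing like "with no speculation" nudges the continuation toward that framing, which is likely why it reduces made-up info.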

FaceDeer@kbin.social on 20 Jan 2024 17:14 next collapse

There's also koboldcpp, which is fairly newbie friendly.

Turun@feddit.de on 21 Jan 2024 01:12 collapse

And llamafile, which packs a chatbot into a single executable file.

EarMaster@lemmy.world on 21 Jan 2024 12:16 collapse

I feel like you’re all making these names up… but they were probably suggested by an LLM altogether…

webghost0101@sopuli.xyz on 21 Jan 2024 15:06 next collapse

Something I’m really missing is a breakdown of how good these models actually are compared to each other.

A demo on Hugging Face couldn’t tell me the boiling point of water, while the author’s own example prompt asked for the boiling point of some chemical.

MTK@lemmy.world on 22 Jan 2024 03:37 next collapse

TriPolarBearz@lemmy.world on 22 Jan 2024 16:32 collapse

Maybe you could ask for the boiling point of dihydrogen monoxide (DHMO), a very dangerous substance.

More info at DHMO.org

webghost0101@sopuli.xyz on 22 Jan 2024 16:56 collapse

I asked H2O first but got no proper answer.

I heard dihydrogen monoxide has a melting point below room temperature, and they seem to find it everywhere, causing huge oxidation damage to our infrastructure; it’s even found inside our crops.

Truly scary stuff.

M500@lemmy.ml on 22 Jan 2024 05:42 collapse

I can’t find a way to run any of these on my home server and access it over HTTP. It looks like it is possible, but you need a GUI to install it in the first place.

Emma_Gold_Man@lemmy.dbzer0.com on 22 Jan 2024 13:08 next collapse

ssh -X

Scipitie@lemmy.dbzer0.com on 22 Jan 2024 18:17 collapse

(edit: here was wrong information - I apologize to the OP!)

Plus a GUI install is not exactly the best for reproducibility, which at least I aim for with my server infrastructure.

Emma_Gold_Man@lemmy.dbzer0.com on 22 Jan 2024 23:31 collapse

You don’t need to run an X server on the headless server. As long as the libraries are compiled in to the client software (the GUI app), it will work. No GUI would need to be installed on the headless server, and the libraries are present in any common Linux distro already (and support would be compiled into a GUI-only app unless it was Wayland-only).

I agree that a GUI-only installer is a bad thing, but the parent was saying they didn’t know how it could be done. “ssh -X” (or -Y) is how.

Scipitie@lemmy.dbzer0.com on 23 Jan 2024 06:08 collapse

That’s a huge today-I-learned for me, thank you! I think I’ll throw xeyes on it just to use ssh -X for the first time in my life. I actually assumed wrong.

I’ll edit my post accordingly!

theterrasque@infosec.pub on 22 Jan 2024 15:57 collapse

Koboldcpp

tubbadu@lemmy.kde.social on 20 Jan 2024 16:35 next collapse

Is it as good as chatgpt?

Infiltrated_ad8271@kbin.social on 20 Jan 2024 17:32 next collapse

The question is quickly answered: none is currently that good, open or not.

Anyway, it seems that this is just a manager. I see some competitor models available that I’ve heard good things about, like Mistral.

Bipta@kbin.social on 20 Jan 2024 18:17 collapse

Local LLMs can beat GPT-3.5 now.

Speculater@lemmy.world on 20 Jan 2024 19:17 next collapse

I think a good 13B model running on 12GB of VRAM can do pretty well. But I’d be hard-pressed to believe anything under 33B would beat 3.5.

miss_brainfart@lemmy.ml on 22 Jan 2024 10:45 collapse

Asking as someone who doesn’t know anything about any of this:

Does more B mean better?

alphafalcon@feddit.de on 22 Jan 2024 12:23 collapse

B stands for billion (parameters), IIRC. More parameters generally means a more capable model, but it also means more RAM/VRAM needed and slower inference.
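The parameter count translates directly into memory: each parameter is stored at some precision, so weights alone need roughly (parameters × bits per parameter / 8) bytes. A quick sketch (lower bounds only — activations and KV cache add more on top):

```python
# Rough memory needed just to hold the weights, by size and precision.

def weight_gb(params_billion: float, bits_per_param: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return round(bytes_total / 1e9, 1)

for b in (7, 13, 33):
    print(f"{b}B: {weight_gb(b, 16)} GB at fp16, {weight_gb(b, 4)} GB at 4-bit")
# 7B:  14.0 GB at fp16,  3.5 GB at 4-bit
# 13B: 26.0 GB at fp16,  6.5 GB at 4-bit
# 33B: 66.0 GB at fp16, 16.5 GB at 4-bit
```

This is why a 4-bit-quantized 13B model is the sweet spot for a 12GB card, as mentioned above, while 33B needs either a bigger GPU or CPU/RAM offloading.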

june@lemmy.world on 21 Jan 2024 01:06 collapse

3.5 fuckin sucks though. That’s a pretty low bar to set imo.

Falcon@lemmy.world on 22 Jan 2024 05:08 collapse

Many are close!

In terms of usability though, they are better.

For example, ask GPT-4 for an example of cross-site scripting in Flask and you’ll get an ethics discussion. Grab an uncensored model off Hugging Face and you’re off to the races.

tubbadu@lemmy.kde.social on 22 Jan 2024 09:31 collapse

Seems interesting! Do I need high end hardware or can I run them on my old laptop that I use as home server?

Falcon@lemmy.world on 22 Jan 2024 13:18 collapse

Oh no you need a 3060 at least :(

Requires CUDA. They’re essentially large mathematical equations that compute the probability of the next word.

The equations are derived by trying different combinations of values until one works well (this is the “learning” in machine learning). The trick is changing the numbers in a way that gets better each time (see e.g. gradient descent).
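That "change the numbers so it gets better each time" loop can be shown in miniature. A toy gradient descent fitting a single weight (real models do the same thing over billions of weights):

```python
# Toy gradient descent: fit y = w * x by repeatedly stepping w
# in the direction that reduces the squared error.

def train(xs, ys, lr=0.01, steps=200):
    w = 0.0
    for _ in range(steps):
        # gradient of mean squared error with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad  # step "downhill"
    return w

xs, ys = [1, 2, 3, 4], [2, 4, 6, 8]  # true relationship: y = 2x
print(round(train(xs, ys), 2))  # → 2.0
```

Each step nudges `w` toward the value that best explains the data; "learning" is just this adjustment repeated at enormous scale.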

ripcord@lemmy.world on 22 Jan 2024 15:34 next collapse

How’s the guy who said he’s running off a 1060 doing it?

Chee_Koala@lemmy.world on 22 Jan 2024 16:14 collapse

Slowly

ripcord@lemmy.world on 22 Jan 2024 23:19 collapse

Then you don’t need a 3060 at least

tubbadu@lemmy.kde.social on 22 Jan 2024 16:18 collapse

Oh this is unfortunate ahahahaha
Thanks for the info!

stevedidWHAT@lemmy.world on 20 Jan 2024 16:55 next collapse

Open source good, together monkey strong 💪🏻

Build cool village with other frens, make new things, celebrate as village

Tja@programming.dev on 20 Jan 2024 19:38 next collapse

Apes together *

stevedidWHAT@lemmy.world on 21 Jan 2024 18:08 collapse

See case in point

Zeon@lemmy.world on 22 Jan 2024 00:58 collapse

It’s free/libre software, which is even better, because it gives you more freedom than just ‘open-source’ software. Make sure to check the licenses of software that you use. The GPL, MIT, and Apache 2.0 are all free-software licenses. Anyways, together monkey strong 💪

randon31415@lemmy.world on 20 Jan 2024 19:30 next collapse

I have recently been playing with llamafiles, particularly LLaVA, which, as far as I know, is the first multimodal open-source LLM (others might exist; this is just the first one I have seen). I was having it look at pictures of prospective houses I want to buy and asking it if it sees anything wrong with the house.

The only problem I ran into is that Windows 10 cmd doesn’t like the sed command, and I don’t know of an alternative.
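One portable stand-in for a simple `sed -i 's/old/new/g' file` is Python's standard library, which works the same under Windows cmd, PowerShell, or a Unix shell. The file name and pattern below are hypothetical examples:

```python
# Cross-platform substitute for a simple in-place sed substitution,
# using only the Python standard library.
import re
from pathlib import Path

def sed_replace(path: str, pattern: str, repl: str) -> None:
    """Apply a regex substitution to a file in place, like sed -i 's/…/…/g'."""
    p = Path(path)
    p.write_text(re.sub(pattern, repl, p.read_text()))

# Hypothetical usage:
# sed_replace("config.txt", r"-ngl \d+", "-ngl 20")
```

The replies below (PowerShell, WSL, Cygwin, `sd`) are other reasonable routes; this one just avoids installing anything extra if Python is already present.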

altima_neo@lemmy.zip on 20 Jan 2024 19:32 next collapse

Powershell, maybe?

ramjambamalam@lemmy.ca on 20 Jan 2024 19:46 next collapse

Would it help to run it under WSL?

halva@discuss.tchncs.de on 20 Jan 2024 22:25 next collapse

Might be a good idea to use Windows Terminal or Cmder and WSL instead of the Windows shells.

Tja@programming.dev on 20 Jan 2024 19:37 next collapse

WSL2

Falcon@lemmy.world on 22 Jan 2024 05:06 next collapse

sd is written in Rust and is cross-platform: github.com/chmln/sd

Does awk run on Windows?

randon31415@lemmy.world on 22 Jan 2024 06:06 collapse

Wait, can you just install sed?

Falcon@lemmy.world on 22 Jan 2024 13:20 collapse

If you can find a copy, yeah. GNU sed isn’t written for Windows, but I’m sure you can find another version of sed that targets Windows.

ripcord@lemmy.world on 22 Jan 2024 15:32 collapse

Install Cygwin and put it in your path.

You can use grep, awk, sed, etc. from either bash or the Windows command prompt.

TootSweet@lemmy.world on 20 Jan 2024 20:41 next collapse

It seems like usually when an LLM is called “Open Source”, it’s not. It’s refreshing to see that Jan actually is, at least.

long_chicken_boat@sh.itjust.works on 22 Jan 2024 04:49 collapse

Jan is just a frontend. It supports various models under multiple licenses. It also supports some proprietary models.

wetferret@lemmy.world on 20 Jan 2024 23:53 next collapse

I would also recommend faraday.dev as a way to try out different models locally using either CPU or GPU. I believe they have a build for every desktop OS.

drislands@lemmy.world on 21 Jan 2024 00:23 next collapse

Sure, Jan.

blazeknave@lemmy.world on 22 Jan 2024 04:31 collapse

Marsha Marsha Marsha!

SpiceDealer@lemmy.world on 22 Jan 2024 05:51 next collapse

But is it FOSS?

FractalsInfinite@sh.itjust.works on 22 Jan 2024 05:56 next collapse

Meet Jan, the open-source ChatGPT alternative

Yes, the article is pretty good

downhomechunk@midwest.social on 22 Jan 2024 06:02 collapse

Sick burn

SpiceDealer@lemmy.world on 22 Jan 2024 06:36 collapse

And well deserved too. I’m not even mad. I really should real the actual article.

ripcord@lemmy.world on 22 Jan 2024 15:36 collapse

I really should real the actual article.

And proofread.

Fades@lemmy.world on 22 Jan 2024 08:36 collapse

It’s literally in the goddamn title, just click the fucking link Jesus Christ

Shit, don’t even have to click the link it’s in the fuckin url

ElPussyKangaroo@lemmy.world on 22 Jan 2024 11:19 next collapse

Any recommendations from the community for models? I use ChatGPT for light work like touching up a draft I wrote, etc. I also use it for data-related tasks like reorganization, identification, etc.

Which model would be appropriate?

[deleted] on 22 Jan 2024 12:52 next collapse

.

Falcon@lemmy.world on 22 Jan 2024 13:51 collapse

Mistral-7B is a good compromise of speed and intelligence. Grab it in a GPTQ 4-bit quant.

ElPussyKangaroo@lemmy.world on 22 Jan 2024 18:32 collapse

Okie dokie.

leanleft@lemmy.ml on 26 Jan 2024 15:04 collapse

savedyouaclick jan.ai