I coded an app over 3 months that might be useful for some people. What are the next steps?
from yukaii@programming.dev to programming@programming.dev on 25 May 15:57
https://programming.dev/post/31005310

#programming


yukaii@programming.dev on 25 May 16:12 next collapse

The goal was to build a “native” app that has a nice UI and can connect to all the different LLM providers. So you don’t have to pay a subscription fee; you only pay as you go to each provider.

The difference from existing apps like Typing Mind is that it mimics the ChatGPT UI as closely as possible, so you don’t feel like you’re using something inferior. What do you guys think?

![](https://programming.dev/pictrs/image/ab0e9a8e-c944-434d-96a1-aa961f77c3aa.png)

Lembot_0002@lemm.ee on 25 May 16:13 next collapse

And what does your program do?

yukaii@programming.dev on 25 May 16:15 collapse

Basically, you can talk to different LLM models through their APIs, but all your conversations are saved locally on your computer, so you don’t have to pay a subscription fee and you have all your chats in one place!
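To make that concrete: the pattern is just calling a provider’s HTTP API with your own key and persisting each exchange to a local file. A rough Python sketch (the endpoint, model name, and file path are illustrative, not the app’s actual code):

```python
import json
from pathlib import Path

# Illustrative endpoint; each provider has its own (Anthropic, Google, etc.).
API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(messages, model="gpt-4o-mini"):
    """Request body for an OpenAI-style chat completions call."""
    return {"model": model, "messages": messages}

def load_chat(path):
    """Load a locally stored conversation, or start a new one."""
    p = Path(path)
    return json.loads(p.read_text()) if p.exists() else []

def save_chat(messages, path):
    """Persist the whole conversation as plain JSON on disk."""
    Path(path).write_text(json.dumps(messages, indent=2))

chat = load_chat("chat.json")
chat.append({"role": "user", "content": "Hello"})
payload = build_payload(chat)
# The actual HTTP call would go here, authenticated with your own API key,
# e.g. a POST of `payload` to API_URL with an Authorization header.
save_chat(chat, "chat.json")
```

Since the history lives in a plain local file, there is nothing server-side to subscribe to.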

7uWqKj@lemmy.world on 25 May 20:51 collapse

So, something like Duck.ai (duck.ai), the built-in AI frontend of DuckDuckGo? Nice, but why would anyone pay for it?

yukaii@programming.dev on 25 May 21:10 collapse

Yes, similar but not quite the same. You can see the features here: https://anylm.app/

enemenemu@lemm.ee on 25 May 16:27 next collapse

I love the GTK UI. I wish it were a frontend for a local AI, but I’m not complaining :) I love it still. Kudos!

yukaii@programming.dev on 25 May 16:53 collapse

Thank you! It could actually be a frontend for local AI. Currently only OpenAI, Anthropic, and Google are supported, but I will be adding support for LM Studio, Ollama, etc. in the future as well!
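Local backends should be a small lift: Ollama and LM Studio both serve an OpenAI-compatible API on localhost, so a provider-agnostic frontend mostly needs a swappable base URL. A hedged sketch (not the app’s code; the ports are those tools’ documented defaults):

```python
# Ollama and LM Studio expose OpenAI-compatible HTTP servers locally, so
# supporting them is largely a matter of a configurable base URL.
PROVIDERS = {
    "openai": "https://api.openai.com/v1",
    "ollama": "http://localhost:11434/v1",   # Ollama's OpenAI-compatible API
    "lmstudio": "http://localhost:1234/v1",  # LM Studio's local server default
}

def chat_url(provider: str) -> str:
    """Chat-completions endpoint for the chosen backend."""
    return PROVIDERS[provider] + "/chat/completions"
```

With that indirection, the same request/response code path covers both cloud and local models.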

dr_robotBones@reddthat.com on 25 May 18:08 next collapse

Is it cross-platform?

yukaii@programming.dev on 25 May 19:07 collapse

Currently only Windows is supported, but because it’s Flutter under the hood, I will be expanding to other platforms shortly!

pinguin@fault.su on 25 May 18:34 next collapse

Is it open source?

yukaii@programming.dev on 25 May 19:10 collapse

No, the application isn’t open source, primarily because I plan to actively support and maintain it for many years to come. But it’s a one-time payment, so you don’t have to pay anything after that!

Adanisi@lemmy.zip on 25 May 19:25 collapse

You could distribute the code under a libre/open source license with the program after someone buys it. Open source doesn’t mean the code is public, just that the users have rights to access and change the code.

yukaii@programming.dev on 25 May 20:08 collapse

How would I go about doing that, though? If I gave users an open-source license on purchase, compiling and reselling would be pretty easy to do, and I don’t want that to happen to something I worked very hard on.

Adanisi@lemmy.zip on 27 May 15:49 collapse

They could resell the binary anyway.

deadcatbounce@reddthat.com on 25 May 18:50 next collapse

I applaud this program, exactly what we’ve all been looking for.

Looks like many of the comments identifying it have been removed.

yukaii@programming.dev on 25 May 19:11 collapse

Thank you.

truxnell@aussie.zone on 25 May 21:08 next collapse

Looks a lot like https://github.com/open-webui/open-webui

I’m using it with OpenRouter; it helps me claw back a little privacy.

massi1008@lemmy.world on 26 May 15:17 collapse

> Privacy & Local Storage
>
> All your chats and data are stored locally on your device, ensuring complete privacy and control.

Assuming those providers don’t store your chats (which they certainly do). Also, no mention of locally hosted models?

If you just want to connect to LLMs, then SillyTavern can also do that, for free.