I'm a total noob to locally-run LLMs. How do you "use" one locally? I remember someone posted a link to an application that you could use to load the model and chat with the bot, but I can't remember what it was called.
Probably Mistral-7B, but keep an eye on https://chat.lmsys.org/?leaderboard for high-ranking small open-source models.
https://ollama.com is pretty good for desktop.
If you want to run one on iOS, there’s also https://apps.apple.com/gb/app/mlc-chat/id6448482937
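If you go the Ollama route, getting started is roughly this (a sketch, assuming Ollama is already installed; `mistral` is just an example model tag, swap in whatever model you want):

```shell
# Pull a model from the Ollama registry (example tag; pick any model you like)
ollama pull mistral

# Chat with it interactively in the terminal
ollama run mistral

# Or ask a one-off question non-interactively
ollama run mistral "Explain what a GGUF file is in one sentence."
```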
Are there any apps for running LLMs locally that also offer a UI with formatted text output, similar to OpenAI's ChatGPT web interface?
Thanks!
Np. I used it a bit. Personally what I’d like to find is a model I can run on a server in my house that the fam can connect to. Especially one that can do code suggestions.
Yeah that would be nice!
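For the home-server idea: Ollama exposes an HTTP API, and by default it only listens on localhost. A rough sketch of opening it up to the LAN (assumes Ollama is installed on the server; `192.168.1.50` is a made-up example address, and `mistral` an example model):

```shell
# On the server: bind Ollama's API to all interfaces instead of just localhost
OLLAMA_HOST=0.0.0.0 ollama serve

# From another machine on the LAN: hit the API (11434 is Ollama's default port;
# the IP address here is a hypothetical example)
curl http://192.168.1.50:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Write a hello-world in Python.", "stream": false}'
```

For the code-suggestion part, editor plugins that speak the Ollama API can point at that same server address.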
Looks like it runs on StartOS
Yes, Ollama, or
https://github.com/ggerganov/llama.cpp
or
https://huggingchat-chat-ui.hf.space/chat/
(you can try it as a guest)
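For llama.cpp specifically, the usual flow is to build it and point it at a GGUF model file. A rough sketch (build steps can differ between versions, and the model filename below is only a placeholder example):

```shell
# Clone and build llama.cpp (requires git, cmake, and a C/C++ toolchain)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Run a prompt against a local GGUF model file (path/filename is an example)
./build/bin/llama-cli -m ./models/mistral-7b-instruct.Q4_K_M.gguf -p "Hello!"
```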
GPT4All