What is considered the best LLM setup to run locally on my M1 MacBook? #askNostr
Discussion
cc nostr:npub1xtscya34g58tk0z605fvr788k263gsu6cy9x0mhnm87echrgufzsevkk5s , I think I saw you talking about this a couple of weeks ago or so?
Probably Mistral-7B, but keep an eye on https://chat.lmsys.org/?leaderboard for high-ranking small open-source models
I'm a total noob to locally-run LLMs. How do you "use" one locally? I remember someone posted a link to an application you could use to load the model and chat with the bot, but I can't remember what it was called.
https://ollama.com is pretty good for desktop.
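If you'd rather script it than use the desktop app, Ollama also serves a local REST API on port 11434. A minimal sketch in Python, assuming Ollama is running and you've already pulled a model (e.g. with: ollama pull mistral):

```python
# Minimal sketch: chat with a locally running Ollama server over its REST API.
# Assumes the Ollama app is running (default port 11434) and the model has
# already been pulled, e.g. with: ollama pull mistral
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "mistral",
        "messages": [{"role": "user", "content": "Why is the sky blue?"}],
        "stream": False,  # return one JSON object instead of a token stream
    },
)
print(resp.json()["message"]["content"])
```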
If you want to run one on iOS, there’s also https://apps.apple.com/gb/app/mlc-chat/id6448482937
Are there any apps for running LLMs locally that also offer a UI with formatted text output, similar to OpenAI's ChatGPT web interface?
Thanks!
Np. I used it a bit. Personally, what I'd like to find is a model I can run on a server in my house that the fam can connect to, especially one that can do code suggestions.
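For the home-server idea, something like this might work (just a sketch): start Ollama with OLLAMA_HOST=0.0.0.0 so it listens on the LAN, and everyone in the house can hit it from Python. The IP below is a placeholder for your server:

```python
# Sketch: query an Ollama instance running on another machine in the house.
# Assumes the server was started with OLLAMA_HOST=0.0.0.0 so it accepts LAN
# connections; 192.168.1.50 is a placeholder for the server's actual address.
import requests

SERVER = "http://192.168.1.50:11434"

resp = requests.post(
    f"{SERVER}/api/generate",
    json={
        "model": "codellama",  # a code-oriented model, for code suggestions
        "prompt": "Write a Python function that reverses a string.",
        "stream": False,
    },
)
print(resp.json()["response"])
```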
Yeah that would be nice!
Looks like it runs on StartOS
Yes, Ollama, or
https://github.com/ggerganov/llama.cpp
or
https://huggingchat-chat-ui.hf.space/chat/ (you can try it as a guest)
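If you go the llama.cpp route, the llama-cpp-python bindings are a quick way to load a GGUF model from Python. A rough sketch; the model path is a placeholder, so grab a quantized GGUF file from Hugging Face first:

```python
# Sketch: run a local GGUF model with the llama-cpp-python bindings
# (pip install llama-cpp-python). The model path is a placeholder; download
# a quantized GGUF file first, e.g. a Mistral-7B instruct quant.
from llama_cpp import Llama

llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What is a GGUF file?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```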
GPT4All
WizardLM 13B or Nous Hermes Llama2 13B. GPT4All gives some more options.
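GPT4All also has Python bindings if you'd rather script those models. A rough sketch; the model name is just an example from GPT4All's catalog, and it gets downloaded on first use:

```python
# Sketch: use GPT4All's Python bindings (pip install gpt4all). The model
# name is an example from GPT4All's catalog; it is downloaded on first use.
from gpt4all import GPT4All

model = GPT4All("nous-hermes-llama2-13b.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("What's the best way to learn Python?", max_tokens=200)
    print(reply)
```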