I'm a total noob to locally-run LLMs. How do you "use" one locally? I remember someone posted a link to an application you could use to load the model and chat with the bot, but I can't remember what it was called.

Discussion

https://ollama.com is pretty good for desktop.

If you want to run one on iOS, there’s also https://apps.apple.com/gb/app/mlc-chat/id6448482937
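For what it's worth, once Ollama is installed and you've pulled a model (for example with "ollama pull llama3"), it exposes a local HTTP API on port 11434 that you can script against. Here's a minimal sketch, assuming the llama3 model is already pulled and the Ollama server is running:

```python
# Minimal sketch: call a locally running Ollama server over its HTTP API.
# Assumes Ollama is running on the default port (11434) and that the
# "llama3" model has already been pulled with `ollama pull llama3`.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",   # any model you have pulled locally
    "prompt": "Explain what running an LLM locally means, in one paragraph.",
    "stream": False,     # ask for a single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # the model's generated text
```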

Are there any apps to run LLMs locally that also offer a UI with formatted text output, similar to OpenAI's ChatGPT web interface?

Thanks!

Np. I used it a bit. Personally, what I'd like to find is a model I can run on a server in my house that the fam can connect to, especially one that can do code suggestions.
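That's roughly what Ollama's server mode gives you: if I understand it correctly, starting it with OLLAMA_HOST set to 0.0.0.0 makes it listen on the LAN, and any machine in the house can then hit its HTTP API. A rough client sketch, assuming a hypothetical server address of 192.168.1.50 and a code-oriented model like codellama pulled on it:

```python
# Rough sketch: chat with an Ollama instance running on another machine
# in the house. Assumes the server was started with OLLAMA_HOST=0.0.0.0
# so it listens on the LAN, and that 192.168.1.50 is its (hypothetical) address.
import json
import urllib.request

SERVER = "http://192.168.1.50:11434"  # hypothetical LAN address of the home server

payload = json.dumps({
    "model": "codellama",             # a code-oriented model pulled on the server
    "messages": [
        {"role": "user", "content": "Suggest a Python function that reverses a string."},
    ],
    "stream": False,                  # single JSON reply rather than a stream
}).encode("utf-8")

req = urllib.request.Request(
    f"{SERVER}/api/chat",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["message"]["content"])    # the assistant's suggestion
```

Anything on the network that can make an HTTP request can talk to it, so the same endpoint could back a shared chat page or an editor plugin for code suggestions.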

GPT4All