Run LLMs locally in a fairly beautiful GUI with many models to choose from. Works great on Mac M-series.

https://lmstudio.ai/


Discussion

Cool trick: it exposes an OpenAI-compatible API, so you can also use it in place of tools that demand OpenAI access.
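To sketch what that looks like: LM Studio's local server defaults to http://localhost:1234 and speaks the OpenAI chat-completions wire format, so any OpenAI-style client works if you swap the base URL. A minimal stdlib-only sketch (the model name "local-model" is just a placeholder for whatever you have loaded):

```python
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:1234/v1"):
    # Same JSON shape as OpenAI's /v1/chat/completions endpoint.
    payload = {
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Actually sending it requires LM Studio's local server to be running:
# with urllib.request.urlopen(build_chat_request("Hello!")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

Tools that hard-code the OpenAI Python client usually accept a base-URL override, which is all you need to point them here instead.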

What apps would you be using that need OpenAI access, for example?

I use it with some IDE or Obsidian plugins. Bye bye surveillance machine, hello private AI!

Also nice because it works totally offline for when I want to be off the grid for a minute.

Here's an interesting repo that runs offline using LM Studio:

https://github.com/KillianLucas/open-interpreter

nostr:note1rdx50zdsvp9hhtj3tvgjnvk3p5u362y5983600pehtpwue2p2c2srn8y54

πŸ™πŸ»πŸ˜˜

Better than Ollama?

Does it support local indexing of docs?

Have you or anyone tried this with LM Studio? It's the one I see mentioned in the Discord. https://github.com/HeliosPrimeOne/ragforge

Looks like they support #Linux, too.

https://lmstudio.ai/beta-releases.html#linux-beta

Any pointers for a complete noob to learn the basics?