Run LLMs locally in a fairly beautiful GUI with many models to choose from. Works great on Mac M-series.
Discussion
Cool trick: it exposes an OpenAI-compatible API, so you can also use it in place of tools that demand OpenAI access.
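A minimal sketch of what "OpenAI-compatible" means in practice: you send the same chat-completions JSON, just to a local URL. This assumes LM Studio's local server is running on its default address (`http://localhost:1234/v1`); the model name is whichever model you've loaded in the GUI.

```python
import json

# Assumption: LM Studio's server default. Check the app's Server tab for the real value.
BASE_URL = "http://localhost:1234/v1"

# Standard OpenAI-style chat completion payload.
payload = {
    "model": "local-model",  # placeholder; LM Studio serves whatever model is loaded
    "messages": [
        {"role": "user", "content": "Say hello"},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload).encode("utf-8")
endpoint = f"{BASE_URL}/chat/completions"
print(endpoint)

# To actually send it (requires the local server to be running):
#   import urllib.request
#   req = urllib.request.Request(endpoint, data=body,
#                                headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```

Because the wire format matches OpenAI's, any client that lets you override the base URL can talk to it unchanged.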
What apps would you be using that need OpenAI access for example?
I use it with some IDE or Obsidian plugins. Bye bye surveillance machine, hello private AI!
Also nice because it works totally offline for when I want to be off the grid for a minute.
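For tools built on the official OpenAI SDKs (many IDE and Obsidian plugins are), you often don't need any code changes at all: the SDKs read a base-URL override from the environment. A sketch, assuming LM Studio's default server port:

```shell
# Redirect OpenAI-SDK-based tools to the local LM Studio server
# instead of api.openai.com. Default address assumed; check the Server tab.
export OPENAI_BASE_URL="http://localhost:1234/v1"
export OPENAI_API_KEY="lm-studio"   # any non-empty string; the local server doesn't check it
```

Plugins that don't read these variables usually expose the same two fields (base URL, API key) in their settings UI.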
Here's an interesting repo that runs offline using LMStudio:
nostr:note1rdx50zdsvp9hhtj3tvgjnvk3p5u362y5983600pehtpwue2p2c2srn8y54
Better than Ollama?
Does it support local indexing of docs?
Have you or anyone tried this with LM Studio? It's the one I see mentioned in the Discord. https://github.com/HeliosPrimeOne/ragforge
Looks like they support #Linux, too.
Any pointers for a complete noob to learn the basics?