Replying to Ivan

Good morning, Nostr. Who's running local LLMs? What are the best models for coding that can run at home on a beefy PC? In 2026, I want to dig into local LLMs more and rely less on Claude and Gemini. I know I can use Maple for more private AI, but I'd prefer running my own model. I also like that models run locally have no restrictions. I know hardware is the bottleneck here; hopefully these things become more efficient.

Austin 2d ago

I use Pinokio to run Open WebUI/Ollama. I'm not a coder, but for engineering I like DeepSeek and Qwen.
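For anyone wanting to try the Ollama route mentioned above, here is a minimal sketch of getting a coding-oriented model running from the command line. It assumes the Ollama CLI is already installed; the model tag `qwen2.5-coder:14b` is one example from the Ollama model library, and the right size depends on your GPU's VRAM.

```shell
# Pull a coding-tuned Qwen model (example tag; pick a size that fits your VRAM)
ollama pull qwen2.5-coder:14b

# One-off prompt from the shell
ollama run qwen2.5-coder:14b "Write a Python function that reverses a string."

# Or start the local API server (default port 11434) for tools like Open WebUI
ollama serve
```

Open WebUI can then point at the local Ollama endpoint (`http://localhost:11434`) to give you a chat interface over the same models.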
