Replying to calle

"I think the snappily titled "gemma3:27b-it-qat" may be my new favorite local model - needs 22GB of RAM on my Mac (I'm running it via Ollama, Open WebUI and Tailscale so I can access it from my phone too) and so far it seems extremely capable"

https://fxtwitter.com/simonw/status/1913728553126175140

https://simonwillison.net/2025/Apr/19/gemma-3-qat-models/

Stvu 8mo ago

Have you tried codex yet?

https://github.com/ymichael/open-codex

This fork gets it working with Ollama.


Discussion

calle 8mo ago

Looks interesting
