"I think the snappily titled "gemma3:27b-it-qat" may be my new favorite local model - needs 22GB of RAM on my Mac (I'm running it via Ollama, Open WebUI and Tailscale so I can access it from my phone too) and so far it seems extremely capable"

https://fxtwitter.com/simonw/status/1913728553126175140

https://simonwillison.net/2025/Apr/19/gemma-3-qat-models/
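For anyone wanting to try the setup described above, a minimal sketch of the Ollama side (assumes Ollama is already installed; model tag taken from the quoted post, and the `curl` call targets Ollama's default local API port, which is what front-ends like Open WebUI talk to):

```shell
# Pull the quantization-aware-trained Gemma 3 27B model (large download)
ollama pull gemma3:27b-it-qat

# Chat with it interactively in the terminal
ollama run gemma3:27b-it-qat

# Or query Ollama's local HTTP API directly -- this is the same
# endpoint a front-end such as Open WebUI would be pointed at
curl http://localhost:11434/api/generate \
  -d '{"model": "gemma3:27b-it-qat", "prompt": "Hello", "stream": false}'
```

Exposing that machine over Tailscale then makes the web UI reachable from a phone on the same tailnet.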


Discussion

Mmm. Thanks, will try running it on mine.

Have you tried codex yet?

https://github.com/ymichael/open-codex

This fork gets it working with Ollama.

Looks interesting