Have you tried codex yet?
https://github.com/ymichael/open-codex
This fork gets it working with Ollama.
"I think the snappily titled "gemma3:27b-it-qat" may be my new favorite local model - needs 22GB of RAM on my Mac (I'm running it via Ollama, Open WebUI and Tailscale so I can access it from my phone too) and so far it seems extremely capable"
Looks interesting