I have not played around with them yet. I am running a Ryzen 5 7640, so I wouldn't expect too much from it. I'll let you know once I've tried it.
I'm guessing that running a model would reduce battery life significantly. Might as well scrape Stack Overflow entirely and work with that offline.
So I have just tried ollama with deepseek and gpt-oss. It does work, but it's super slow and maxes out the CPU super fast. I guess it's great for single questions, but it probably won't be super useful for vibing. Which is okay haha, I am fine writing some code by hand when on a plane.
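For the single-question use case, you don't even need to keep a terminal chat open. Here's a minimal sketch of asking a question through ollama's local HTTP API from Python, assuming the server is running on its default port 11434 and that the model tag (`deepseek-r1` here, just as an example) matches one you've already pulled:

```python
import json
import urllib.request

def ask(prompt: str, model: str = "deepseek-r1") -> str:
    """Send one prompt to the local ollama server and return the full answer."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # wait for the complete response instead of streaming tokens
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Explain Python's GIL in two sentences."))
```

On a CPU-only machine like that, expect each call to take a while and peg the cores for the duration, so it really is more of an offline reference lookup than an interactive assistant.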