Spent two hours coding while on planes. Glad I can still be productive without AI, but man, this is slooooow…
Discussion
Can your laptop support local models? If so, some of the small models are great for simple coding tasks.
I have not played around with them yet. I am running a Ryzen 5 7640, so I wouldn't expect too much from it. I'll let you know once I've tried it.
I'm guessing that running a model would reduce battery life significantly. Might as well scrape Stack Overflow entirely and work with that offline.
So I have just tried ollama with deepseek and gpt-oss. It does work, but it's super slow and maxes out the CPU super fast. I guess it's great for single questions, but it probably won't be super useful for vibing. Which is okay haha. I am fine writing some code by hand when on a plane.
Agree. How did we ever get anything done before?
I can see it now: "You kids have no idea ... back in my day, we had to code by hand!"