Spent two hours coding while on planes. Glad I can still be productive without AI, but man, this is slooooow…


Discussion

Can your laptop support local models? If so, some of the small models are great for simple coding tasks.

I have not played around with them yet. I am running a Ryzen 5 7640, so I wouldn't expect too much from it. I'll let you know once I've tried it.

I'm guessing that running a model would reduce battery life significantly. Might as well scrape Stack Overflow entirely and work with that offline.

So I have just tried ollama with deepseek and gpt-oss. It does work, but it's super slow and maxes out the CPU super fast. I guess it's great for single questions, but it probably won't be super useful for vibing. Which is okay haha. I am fine writing some code by hand when on a plane.

Agree. How did we ever get anything done before?

I can see it now: "You kids have no idea ... back in my day, we had to code by hand!"

nostr:nevent1qvzqqqqqqypzph0s8t9gtt0q88n8gt2mau7lx5klrxws6v0z9wv93eld4pwt8wa7qqs2x0qjzmdkzqk826hp26c0lwdfat3hjp2hsr7mpa0jr48zhm793wcavp2jh