I was just looking at setting up Ollama today. Is this de wey?
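For anyone following along, a minimal Ollama setup looks roughly like this (the install script is the Linux route; on macOS you'd typically use Homebrew or the app download, and `llama3` is just an example model — pick whatever fits your hardware):

```shell
# Install Ollama (Linux install script; on macOS: brew install ollama,
# or grab the app from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull an example model (swap in whatever your RAM/VRAM can handle)
ollama pull llama3

# Chat interactively in the terminal
ollama run llama3

# Or hit the local HTTP API (Ollama serves on port 11434 by default)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Hello"
}'
```

Editor plugins that speak the Ollama API point at that same `localhost:11434` endpoint.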


Discussion

Meh. Depends on the machine. I'm running an M3 MBP and it's fast enough. ChatGPT is probably better, but I don't use it.

I tried local models (Ollama/Llama) with code plugins, but I'm finding cloud services like Replit much better.

Whole-codebase context and much faster.

Replit agent is nuts.