lol well that didn't take long. wtf replit? i guess i'm going with cursor

Wtf!
Not your LLM, not your AI agent I guess!
Llama works locally with a VS Code plugin. Slow unless you have some decent compute though.
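For anyone trying the local route, a minimal sketch of that setup (assuming Ollama as the local runner and a plugin like Continue on the VS Code side; the model tag is an assumption, pick whatever your hardware can handle):

```shell
# Pull a Llama model for local use (model tag is an assumption)
ollama pull llama3

# Quick sanity check from the terminal
ollama run llama3 "say hi"

# Ollama serves an API on localhost:11434 by default; point a VS Code
# plugin such as Continue at it as its local model backend
```

Expect it to crawl on CPU-only machines; a recent GPU or Apple Silicon with enough RAM makes it usable.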
Cursor is nice but it sucks when the free trial runs out and it refuses to play nice with other LLMs. Back to square one I guess.
nostr:note1yc2zh4qjssn4fgthx44sezmk5a05qfaexe30v2dna073sq76n74qagf4t0