Very easy deploy options too. All in a browser. Nuts, really.


Discussion

lol well that didn't take long. wtf, Replit? I guess I'm going with Cursor.

Wtf!

Not your LLM, not your AI agent, I guess!

Llama works locally with a VS Code plugin. It's slow unless you have some decent compute, though.