This is what I want!
The most time-consuming part of coding with an LLM is collecting the right context to put in the prompt. But too much context can be just as bad. I will look into Replit. 🙏
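Roughly what I mean by "right context" — a quick sketch of stuffing a few relevant files into a prompt under a size cap so it doesn't balloon. The file paths and the budget number are made up, not from any real project:

```python
# Sketch: build a prompt from a handful of relevant files, capped at a
# rough character budget. Budget and paths are placeholder examples.
from pathlib import Path

CONTEXT_BUDGET = 12_000  # ~3k tokens at ~4 chars/token, a crude heuristic

def build_prompt(question: str, files: list[str]) -> str:
    parts = [question]
    used = len(question)
    for path in files:
        p = Path(path)
        if not p.exists():
            continue  # skip anything missing rather than crash
        snippet = f"\n\n# --- {path} ---\n{p.read_text()}"
        if used + len(snippet) > CONTEXT_BUDGET:
            break  # past the budget, more context starts hurting
        parts.append(snippet)
        used += len(snippet)
    return "".join(parts)

print(build_prompt("Why does login 500?", ["auth/session.py", "auth/views.py"]))
```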
V easy deploy options too. All in a browser. Nuts really.
lol well that didn't take long. wtf replit? i guess i'm going with cursor
Wtf!
Not your LLM, not your AI agent I guess!
Llama works locally with a VS Code plugin. Slow unless you have some decent compute though.
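If anyone wants to kick the tires on the local route before wiring up the VS Code plugin, here's a rough sketch hitting a locally running Llama through Ollama's HTTP API. Assumes `ollama serve` is running and you've pulled a model already; the model name and port are just the defaults on my machine:

```python
# Minimal sketch: query a local Llama via Ollama's HTTP API.
# Assumes the Ollama server is running and the model is pulled
# (e.g. `ollama pull llama3`). Adjust model/port for your setup.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",
        "prompt": "Write a haiku about slow local inference.",
        "stream": False,  # one JSON blob instead of a token stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

On CPU this is painfully slow for anything nontrivial, which matches the "decent compute" caveat above. But at least nobody can pull the plug on it.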