This is what I want!

The most time-consuming part of coding with an LLM is collecting the right context to put in the prompt. But too much context can be just as bad. I will look into Replit. 🙏


Discussion

V easy deploy options too. All in a browser. Nuts really.

lol well that didn't take long. wtf replit? i guess i'm going with cursor

Wtf!

Not your LLM, not your AI agent I guess!

Llama works locally with a VS Code plugin. Slow unless you have some decent compute, though.