Wtf!
Not your LLM, not your AI agent I guess!
Llama works locally with a VS Code plugin. Slow unless you have some decent compute, though.
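For anyone wondering what "locally" means in practice: most of these VS Code plugins just talk to a local inference server over HTTP. Here's a minimal Python sketch, assuming you already have something like a llama.cpp or Ollama server running with an OpenAI-compatible endpoint; the port and model name below are placeholders, not anything specific to a particular plugin.

```python
# Minimal sketch: query a locally served Llama model over an
# OpenAI-compatible chat endpoint. Assumes a local server (e.g. llama.cpp
# or Ollama) is already running; URL and model name are placeholders.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "llama-3-8b-instruct",  # whatever model the local server has loaded
        "messages": [
            {"role": "user", "content": "Explain what this function does."}
        ],
        "temperature": 0.2,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Speed then comes down entirely to your hardware, which is the point above: on a laptop CPU this is painfully slow, on a decent GPU it's usable.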