Not your LLM, not your AI agent I guess!

Llama runs locally with a VS Code plugin, though it's slow unless you have some decent compute.
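
For what it's worth, a common way to wire this up (an assumption on my part, since the note doesn't name the runtime or plugin) is to serve the Llama model locally with something like Ollama and have the editor extension call it over HTTP. A minimal sketch of that request in Python:

```python
# Minimal sketch, assuming a Llama model served locally by Ollama
# (runtime and model name are assumptions; the note doesn't specify them).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's local generate endpoint
    json={
        "model": "llama3",                    # whichever Llama build you pulled locally
        "prompt": "Explain this function: def add(a, b): return a + b",
        "stream": False,                      # return one JSON object instead of a token stream
    },
    timeout=120,
)
print(resp.json()["response"])
```

On modest hardware that round trip can take a while, which is the "slow unless you have decent compute" part.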
