Ollama runs your model locally

This project is separate code from Ollama: it presents itself as Ollama, exposing the same API, but actually serves models through Venice.

So any app that would normally talk to a local model through Ollama can use Venice instead.
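The core of such a shim is a translation step: accept a request in Ollama's `/api/generate` format and rewrite it as an OpenAI-style chat-completion payload for Venice. A minimal sketch of that mapping follows; the Venice endpoint URL and the exact field names are assumptions for illustration, not taken from this project's source.

```python
# Sketch of the translation idea behind an Ollama-compatible shim:
# rewrite an Ollama /api/generate request body into an OpenAI-style
# chat-completion payload. Endpoint URL and field mapping below are
# assumptions; check the project's source for the real details.

VENICE_URL = "https://api.venice.ai/api/v1/chat/completions"  # assumed endpoint

def ollama_to_venice(ollama_req: dict) -> dict:
    """Rewrite an Ollama-style generate request as a chat payload."""
    return {
        "model": ollama_req["model"],
        "messages": [{"role": "user", "content": ollama_req["prompt"]}],
        "stream": ollama_req.get("stream", False),
    }

if __name__ == "__main__":
    req = {"model": "llama3", "prompt": "Hello"}
    print(ollama_to_venice(req))
```

An app pointed at the shim keeps sending plain Ollama requests; only the shim knows the upstream is Venice.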
