Ollama runs your model locally.
This project is not Ollama itself: it is a separate program that presents the same API as Ollama but serves requests through Venice.
So any app that would normally talk to a local model through Ollama can use Venice instead, without changes to the app.
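As a sketch of what this means in practice: an Ollama client sends its usual request to the local Ollama port, and the proxy forwards it to Venice behind the scenes. The snippet below builds a standard Ollama-style `/api/generate` request body; the port (11434, Ollama's default) and the model name are assumptions for illustration.

```python
import json

# Ollama's default local endpoint; with this project, the proxy would
# be listening here instead of a real Ollama server (assumption).
URL = "http://localhost:11434/api/generate"

# A standard Ollama generate request; the model name is hypothetical.
payload = {
    "model": "llama3",
    "prompt": "Say hello in one sentence.",
    "stream": False,
}

body = json.dumps(payload)
print(URL)
print(body)
```

Sending this body with any HTTP client (for example `curl -d @- http://localhost:11434/api/generate`) would reach the proxy rather than a local model, and the response would come from Venice in Ollama's response format.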