You can run ollama today.

https://ollama.com/

They have not done anything special for running the models; the models themselves are open source. The value of the service is that they run the models on their infrastructure. If you want to run them yourself, there are plenty of better options, such as ollama, along with many frontends such as Open WebUI.
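For example, a self-hosted Ollama instance exposes a local HTTP API (port 11434 by default) that any frontend or script can talk to. Here is a minimal Python sketch using only the standard library; the model name `llama3` is just an example, and it assumes `ollama serve` is running with that model already pulled:

```python
import json
import urllib.request

# Ollama's default local generate endpoint (assumes a server on this machine)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for Ollama's REST API."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running server (`ollama serve`) and a pulled model,
    # e.g. `ollama pull llama3` -- both are assumptions here.
    print(generate("llama3", "Why is the sky blue?"))
```

Because it is just an HTTP API, you can point any machine on your network at the server's address instead of `localhost` and get remote access without the hosted service.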


Discussion

Yes, I know venice.ai doesn't have any magic sauce. My main constraint is hardware: I have a home server but no capable GPU. A stopgap could be running models locally on my MacBook, but I'd love to have something that is accessible anywhere.