But with open source, at least someone (myself, for example) can take what they've done and run it however they want. Although that brings me back to my problem of not currently having the right hardware.
Discussion
You can run ollama today.
They haven't done anything special to run the models. The models themselves are openly available; the value of the service is that they run them on their own infrastructure. If you want to run them yourself, there are plenty of better options, like ollama, along with frontends such as Open WebUI.
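For what it's worth, getting started with ollama is just a couple of commands. A minimal sketch (the model name `llama3` is just an example; pick whatever fits your hardware):

```shell
# Install ollama (Linux/macOS install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model locally
ollama run llama3

# ollama also serves a local HTTP API on port 11434,
# which frontends like Open WebUI can point at:
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

Open WebUI then just needs the ollama API URL to give you a hosted-style chat interface on top of it.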
Yes, I know venice.ai doesn't have any magic sauce. My main constraint is hardware: I have a home server but no capable GPU. A stopgap could be running locally on my MacBook, but I'd love to have something that's accessible anywhere.
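If the MacBook stopgap works for you, "accessible anywhere" is mostly a networking problem rather than an ollama problem. A rough sketch, assuming you're comfortable exposing the machine on your own network (the bind address below uses ollama's `OLLAMA_HOST` environment variable):

```shell
# By default ollama only listens on localhost.
# Bind to all interfaces so other machines on the network can reach it:
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# From another device on the same network (replace with the Mac's IP):
curl http://192.168.1.50:11434/api/tags
```

For access from outside your home network, you'd still want to put a VPN or some other authenticated tunnel in front of it rather than exposing the port directly, since the ollama API itself has no auth.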