ollama proxy with free mode -> if you just need *some* llm backend to work, i've made an ollama/openai-compatible proxy that pulls the currently free models from openrouter and routes requests to them; if one fails, the next one is tried.
you can use filters to restrict which free models are used, e.g. only mistral models. it also supports paid models.
https://github.com/aljazceru/ollama-free-model-proxy
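
for a quick sanity check, here's a rough sketch of pointing the standard openai python client at the proxy's openai-compatible endpoint. the base url, port, and model name below are assumptions, swap in whatever your setup actually exposes:

```python
# minimal sketch: talk to the proxy through its OpenAI-compatible API.
# base_url, port, and model name are assumptions -- adjust to your config.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # hypothetical proxy address/port
    api_key="unused",                      # local proxy; key typically ignored
)

resp = client.chat.completions.create(
    model="mistral-7b-instruct",  # hypothetical; any model the proxy exposes
    messages=[{"role": "user", "content": "say hi in one sentence"}],
)
print(resp.choices[0].message.content)
```

since the proxy handles the fallback between free models itself, the client code stays the same whichever upstream model ends up answering.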