it's just the OpenAI API specification, no need to connect to any OpenAI servers
you can start the proxy using this command:
docker run -p 8080:8080 ghcr.io/routstr/proxy
check the .env.example in the github.com/routstr/proxy repo for all the configuration options
you need to set it up so it proxies to your local ollama server
you only need an OpenAI API key if you want to proxy requests to their servers
let me know if you're facing any issues!
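To make the "proxy to your local ollama server" part concrete, here is a minimal sketch. The env var names below are assumptions for illustration (check .env.example in the repo for the actual names); the :11434/v1 path is Ollama's standard OpenAI-compatible endpoint.

```shell
# Hypothetical sketch: point the proxy at a local Ollama instance.
# UPSTREAM_BASE_URL is an assumed variable name -- verify against .env.example.
# host.docker.internal lets the container reach Ollama on the host machine.
docker run -p 8080:8080 \
  -e UPSTREAM_BASE_URL=http://host.docker.internal:11434/v1 \
  ghcr.io/routstr/proxy
```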
From the proxy README: "The proxy implements a seamless eCash payment flow that maintains compatibility with existing OpenAI clients while enabling Bitcoin micropayments"
What are these OpenAI-compatible clients? (You can consider me an AI beginner.)
How can I connect to the proxy when it's running to test incoming requests via routstr?
I also noticed that I had to set an nsec env variable (which is not mentioned in the .env.example).
for example, any AI agent app is OpenAI-compatible, e.g. Cursor/Cline/Windsurf/Goose
I use Goose (desktop and cli), so if I'm correct I can configure routstr as a provider there?
yes, you just need to change the base_url to https://api.routstr.com/v1, the model to your desired model (e.g. anthropic/claude-4-opus), and the API key to a Cashu token or a prefilled API key from chat.routstr.com (inside settings)
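The same settings Goose would use can be smoke-tested from the command line first. This is a sketch against the standard OpenAI chat completions endpoint; replace the placeholder with your own Cashu token or prefilled API key.

```shell
# Send one test request through the routstr endpoint using the
# base_url, model, and API key described above.
curl https://api.routstr.com/v1/chat/completions \
  -H "Authorization: Bearer <cashu-token-or-api-key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "anthropic/claude-4-opus", "messages": [{"role": "user", "content": "hello"}]}'
```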
Thx for the quick reply!
I found that, yes, but I do need to build the Docker image myself (using the docker compose file) as I'm running this on arm hardware
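For reference, building locally on arm instead of pulling the prebuilt image would look roughly like this. The exact compose service names are defined in the repo's compose file, so treat this as a sketch:

```shell
# Hypothetical sketch: build the proxy image for the local (arm) architecture
# from the repo's own Dockerfile/compose file rather than pulling from ghcr.io.
git clone https://github.com/routstr/proxy
cd proxy
docker compose build
docker compose up -d
```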
Work in progress ;)