Running Ollama with DeepSeek-R1

https://shares.sebastix.dev/NArB4IlJ.mp4

It thinks / writes just as fast as I can read 😜

Next: try to set this up as a provider with nostr:npub130mznv74rxs032peqym6g3wqavh472623mt3z5w73xq9r6qqdufs7ql29s proxy 👀 but is it correct that I need an OpenAI API key? /cc nostr:npub18gr2m5cflkzpn6jdfer4a8qdlavsn334m9mfhurjsge08grg82zq6hu9su

nostr:nevent1qqsflw568sw3hgcn586mtketstu2cmmzyjtue0657yvmvfnyd8ecg7szyqrx8x3cdjwpq9ppwc3ve085pyyvfudqcvlz87xk668540m9t78hzqcyqqqqqqgpr9mhxue69uhhwmm59eek2cnpwd6xj7pwwdhkx6tpdsykfckx

Discussion

it's just the openai api specification, no need to connect to any openai servers

you can start the proxy using this command:

docker run -p 8080:8080 ghcr.io/routstr/proxy

check the .env.example in the github.com/routstr/proxy repo for all the configuration options

you need to set it up so it proxies to your local ollama server
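
rough sketch of what that could look like (untested — i'm guessing the upstream env var name here, check .env.example for the real one; ollama exposes an openai-compatible endpoint on port 11434 at /v1):

# UPSTREAM_BASE_URL is an assumed name, see .env.example for the actual variable
# host.docker.internal lets the container reach ollama running on the host
docker run -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e UPSTREAM_BASE_URL=http://host.docker.internal:11434/v1 \
  ghcr.io/routstr/proxy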

you only need an openai api key if you want to proxy requests to their servers

let me know if you're facing any issues!

From the proxy README: "The proxy implements a seamless eCash payment flow that maintains compatibility with existing OpenAI clients while enabling Bitcoin micropayments"

What are these OpenAI-compatible clients (you can consider me an AI beginner)?

How can I connect to the proxy when it's running to test incoming requests via routstr?

I also noticed that I had to set an nsec env variable (which is not mentioned in the .env.example).

for example, any ai agent app is openai-compatible, e.g. cursor/cline/windsurf/goose
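
to see what that means on the wire, here's a rough curl you could point at the locally running proxy to test incoming requests (a sketch only — i'm assuming the proxy serves the standard openai paths on port 8080 and that deepseek-r1 is the model name your ollama exposes):

# standard openai-style chat completion request, aimed at the local proxy
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <cashu-token-or-api-key>" \
  -d '{"model": "deepseek-r1", "messages": [{"role": "user", "content": "hello"}]}'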

I use Goose (desktop and cli), so if I'm correct I can configure routstr as a provider there?

yes, you just need to change the base_url to https://api.routstr.com/v1, the model to your desired model (e.g. anthropic/claude-4-opus), and the api key to a cashu token or a prefilled api key from chat.routstr.com (inside settings)
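
a quick way to sanity-check the base_url before wiring it into goose (assuming the hosted proxy mirrors the openai /v1/models endpoint):

# list the models available through the routstr endpoint
curl https://api.routstr.com/v1/models \
  -H "Authorization: Bearer <cashu-token-or-api-key>"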

Thx for the quick reply 🙏

I found that yes, but I do need to build the Docker image myself (using the docker compose file) as I'm running this on ARM hardware.
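
Roughly what I'm running, for reference (assuming the compose file in the repo builds cleanly on arm64):

# clone the repo and build the image locally instead of pulling the prebuilt one
git clone https://github.com/routstr/proxy && cd proxy
docker compose build
docker compose up -d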

Work in progress ;)