we just spent 120k sats to create this new cashu NIP-60 implementation in python
120k sats because Claude 4 Opus in Cursor costs a lot
we needed this internally to improve the proxy implementations, so why not publish it as a separate library
this is the first version of it, which we want to experiment with to onboard providers
It's not perfect, but it's the MVP of the marketplace
we needed to add new restrictions to the proxy so we don't lose money, so there is now a minimum sat balance requirement depending on the model you want to use. we haven't reflected this in the frontend yet, sorry for that, but it should be fixed asap
nostr:npub16g4umvwj2pduqc8kt2rv6heq2vhvtulyrsr2a20d4suldwnkl4hquekv4h and nostr:npub130mznv74rxs032peqym6g3wqavh472623mt3z5w73xq9r6qqdufs7ql29s can we get a search box for all the models? or the ability to set some as favorites? I use the web interface a lot and I want to use different models for search / code, but it's difficult to switch between them
Just added it to our to-do list, it'll be available asap
A single request to Claude 4 Opus can cost up to $5.40 (~5k sats) with context and completion maxed out.
There is no way to estimate the cost before the request, so we have to require a minimum balance in your cashu token before you can use a model.
This cost varies by model; for example, Gemini 2.5 Flash has a maximum cost of $0.20 (~180 sats).
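A rough sketch of how that minimum-balance check could work. The sats-per-dollar rate, model IDs, and function name below are illustrative only, not the proxy's actual values:

```python
# Illustrative only: the rate and per-model max costs mirror the numbers
# in this post, but a real proxy would derive them from live prices.
SATS_PER_USD = 950  # ~$5.40 ≈ 5k sats implies roughly this rate

MAX_COST_USD = {
    "anthropic/claude-4-opus": 5.40,   # context + completion maxed out
    "google/gemini-2.5-flash": 0.20,
}

def min_balance_sats(model: str) -> int:
    """Minimum cashu token balance (in sats) required before a request is accepted."""
    return round(MAX_COST_USD[model] * SATS_PER_USD)
```

With these placeholder numbers, Gemini 2.5 Flash would require about 190 sats and Claude 4 Opus about 5130 sats of balance up front.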
Our public onion endpoint:
http://mcfftxiqtpwvrsly4..titn3frmf55frcleid.onion
If you want to host your own provider and appear in the marketplace, just post a nostr message, mention nostr:npub130mznv74rxs032peqym6g3wqavh472623mt3z5w73xq9r6qqdufs7ql29s, and add your onion URL like in this message. We have a script in our backend that searches nostr for these messages and then includes your provider.
We're launching the marketplace soon; then you can use your nostr social graph to decide which providers to trust
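For illustration, the extraction step of such a discovery script might look like this. The regex and function name are my own sketch, not the actual backend code:

```python
import re

# v3 onion hostnames are 56 base32 characters; accept shorter legacy ones too.
ONION_RE = re.compile(r"[a-z2-7]{16,56}\.onion")

def extract_onion_hosts(note_text: str) -> list[str]:
    """Pull candidate .onion hostnames out of a nostr note's text."""
    return ONION_RE.findall(note_text)
```

The relay subscription and npub-mention filtering are omitted here; this only shows how onion URLs could be recognized once a matching note's text is in hand.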
if it works with OpenAI, you just need to change two environment variables
or you can just make a direct API call:
curl -X POST https://api.routstr.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer cashuYourValidToken" \
  -d '{"model": "deepseek-ai/DeepSeek-R1", "messages": [{"role": "user", "content": "Hello Nostr"}]}'
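The two environment variables mentioned above might be set like this. The variable names assume the official OpenAI Python SDK; the base URL and token are placeholders:

```python
import os

# Assumes a client that reads the standard OpenAI SDK variables;
# the base URL and the cashu token below are placeholders.
os.environ["OPENAI_BASE_URL"] = "https://api.routstr.com/v1"  # point the client at the proxy
os.environ["OPENAI_API_KEY"] = "cashuYourValidToken"          # cashu token in place of an API key

# Any OpenAI SDK client created after this picks up both values automatically.
```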
Do you want to chat on my AI podcast, The Acceleration? https://fountain.fm/show/VP1yXMJeQCP4Ho3kYL1C
Let’s do it!
Of course everything is fully open source, and it will stay that way.
We want to build everything decentralized, so everyone will be able to host their own LLM servers, their own marketplace, their own evaluation service, their own chat clients, …
At Routstr we're building the first marketplace to buy and sell LLM APIs using cashu tokens.
Using our proxy you can monetize any OpenAI-API-compatible LLM server on a per-token basis at msat precision (e.g. your locally hosted LlamaCPP server).
As a user you can just put a cashu token into the API-key field of your LLM application, and as long as the balance lasts you can keep using the same token.
We don't have everything built out yet, and soon we're bringing the marketplace onto nostr so people can list their own nodes.
To ensure trust and quality, we're working on a separate evaluation system so you know which nodes provide good service.
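Per-token billing at msat precision boils down to simple arithmetic; here is a sketch with made-up prices (the function and rates are illustrative, not our actual billing code):

```python
import math

def charge_msats(prompt_tokens: int, completion_tokens: int,
                 prompt_msat_per_tok: float, completion_msat_per_tok: float) -> int:
    """Total charge in millisats, rounded up so the provider never undercharges."""
    total = (prompt_tokens * prompt_msat_per_tok
             + completion_tokens * completion_msat_per_tok)
    return math.ceil(total)

# e.g. 1000 prompt tokens at 0.5 msat each + 200 completion tokens at 1.5 msat each
```

The msat (1/1000 sat) granularity is what makes per-token pricing workable: a single token can cost a fraction of a sat without rounding the charge down to zero.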
A bitcoin technology

check out our GitHub!
