Remi-chan (on Routstr)
6892d1b6d4d235f912aa187efaac006d99589587af29598337b883f0d201b0ba
I'm Remi, a model provider on @Routstr, a KYC-free, open-source, nostr-based, bitcoin-enabled LLM proxy.
Replying to Henky!!

nostr:nprofile1qqsgha3fk023ng8c4quszdayghqwkt6l9d9ga4c3280gnqz3aqqx7ycppamhxue69uhku6n4d4czumt99usxq3zu how desirable is occasional hosting? I already host on the free AI Horde sometimes, but #KoboldCpp gives priority to regular generations. If I put a machine on, running a local AI model that's not even on your platform, I can only do so for a little at a time, which will make availability fluctuate.

"Yes, please do our users expect it"

Or

"No, there are minimum hosting requirements (to earn rewards)

I'm hosting from a machine that will sometimes be unavailable.

Let's all just give it a go!

The chat messages include the full conversation, so context can move between providers; I think they'll end up load-balancing the models across providers, or something similar, to account for unavailability.
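
Roughly the idea in code, as I understand it: since every request carries the whole message history, a client can retry the next turn against whichever provider is currently up. This is only an illustrative sketch, the provider list, model name, endpoint path, and fallback loop are my own placeholders (OpenAI-style chat completions assumed), and payment/auth headers are left out:

```python
import requests

# Illustrative only: candidate provider base URLs; real routing is done by the proxy.
PROVIDERS = [
    "https://routstr.rewolf.dev",  # clearnet endpoint mentioned in this thread
    # "http://<onion-address>.onion",  # reachable via a Tor SOCKS proxy
]

# The full conversation travels with every request, so no provider keeps session state.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

def complete(messages, model="example-model", timeout=30):
    """Try each provider in turn; whichever is reachable can continue the conversation."""
    for base in PROVIDERS:
        try:
            resp = requests.post(
                f"{base}/v1/chat/completions",  # assumed OpenAI-compatible path
                json={"model": model, "messages": messages},
                timeout=timeout,
            )
            resp.raise_for_status()
            return resp.json()["choices"][0]["message"]["content"]
        except requests.RequestException:
            continue  # provider unavailable, fall through to the next one
    raise RuntimeError("no provider reachable")
```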

nostr:nprofile1qyxhwumn8ghj7mn0wvhxcmmvqydhwumn8ghj7mn0wd68ytnzd96xxmmfdecxcetzwvhxgegqyz9lv2dn65v6p79g8yqn0fz9cr4j7hetf28dwy23m6ycq50gqph3xc9yvfs Just want to confirm you saw I have my provider hosted on both:

onion: http://5nltc2eumzxl6652tjxggf2eywypxmmcsiscoqqq42nif2hgb4f3ouqd.onion

clearnet: https://routstr.rewolf.dev/
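
If anyone wants to poke at those two endpoints, here's a rough reachability check (not an official tool; the /v1/models path is my assumption based on the OpenAI-compatible API, and the onion one needs a local Tor SOCKS proxy, usually 127.0.0.1:9050):

```python
import requests  # needs the 'requests[socks]' extra for the onion check

ENDPOINTS = {
    "clearnet": ("https://routstr.rewolf.dev", None),
    # Onion address from this thread; requires Tor running locally.
    "onion": (
        "http://5nltc2eumzxl6652tjxggf2eywypxmmcsiscoqqq42nif2hgb4f3ouqd.onion",
        {"http": "socks5h://127.0.0.1:9050", "https": "socks5h://127.0.0.1:9050"},
    ),
}

for name, (base, proxies) in ENDPOINTS.items():
    try:
        # /v1/models is assumed here as a lightweight liveness check.
        r = requests.get(f"{base}/v1/models", proxies=proxies, timeout=20)
        print(f"{name}: HTTP {r.status_code}")
    except requests.RequestException as err:
        print(f"{name}: unreachable ({err})")
```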

Thanks y'all!

Out of interest, do you have a public communications channel for dev discussions? Would love to see what you guys are cooking, and maybe share some ideas!