It's a server that wraps an OpenRouter-compatible API around locally hosted LLM models, which means you can use these models in your regular workflows, in applications, or on websites, like we do on pollerama.fun or formstr.app.
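
As a rough sketch, any client that already speaks the OpenRouter/OpenAI chat-completions format can simply be pointed at the local server. The host, port, endpoint path, and model name below are assumptions, not the server's actual defaults; substitute whatever your install uses:

```bash
# Sketch: call the locally hosted, OpenRouter-compatible endpoint.
# localhost:11434, the /v1/chat/completions path, and "gemma3" are
# assumptions -- point this at wherever your server actually listens.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma3",
    "messages": [{"role": "user", "content": "Hello from a locally hosted model!"}]
  }'
```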

Discussion

A lot of these models will work on your laptops and Raspberry Pis too.

I will try to create the account.

Is it something similar to Routstr? I only know the name. https://www.routstr.com/

No, these models work on your device itself; they don't need an internet connection at all.

Which model works? I tried gemma3...

Gemma3 would work

I personally use gemma3 as well
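
For anyone following along: assuming Ollama is the local backend (which the CORS error further down in this thread suggests), getting gemma3 onto the machine looks roughly like this:

```bash
# Sketch, assuming the models are served locally through Ollama:
ollama pull gemma3   # download the model weights
ollama run gemma3    # quick interactive test in the terminal
ollama list          # confirm it shows up among the installed models
```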

Good, I tried but I couldn't make it work.

Where did you face a roadblock? What error are you getting?

Probably CORS: "Ollama API Error: 403 - "

Yes, you need to do an `export ALLOW_ORIGINS=*` before you run the server; you can edit the service file if you're on Linux, as in the sketch below.
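
A minimal sketch of making that setting persistent when the server runs as a systemd service; the unit name `local-llm-server` is a placeholder, so use whichever unit actually runs your server:

```bash
# Sketch: make ALLOW_ORIGINS=* persistent for a systemd-managed server.
# "local-llm-server" is a placeholder -- use your actual unit name.
sudo systemctl edit local-llm-server
# In the override file that opens, add:
#   [Service]
#   Environment="ALLOW_ORIGINS=*"
# Then reload and restart:
sudo systemctl daemon-reload
sudo systemctl restart local-llm-server
```

Setting `*` accepts requests from any origin, which is what lets a browser page or extension talk to the local API; you can narrow it to a specific origin if you prefer.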

I downloaded another extension, "Page Assist ... ", and it seems to work.