I saw the call to action for the extension on the site, but I don't have an Ollama server. Can you explain it to me?


Discussion

It's a server that wraps an OpenAI-compatible API around locally hosted LLM models, which means you can use those models in your regular workflows, in applications, or on websites, as we do on pollerama.fun and formstr.app.
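For example, here's a minimal sketch of what that looks like from Python, assuming Ollama is running locally on its default port (11434) and you've already pulled a model (the model name "llama3.2" here is just an example):

```python
# Minimal sketch: talking to a local Ollama server through its
# OpenAI-compatible endpoint. Assumes Ollama is running on the
# default port 11434 and a model has been pulled beforehand.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",  # the client requires a key; Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3.2",  # any model you've pulled with `ollama pull`
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

Because the API shape matches OpenAI's, any tool or library that speaks that API can be pointed at your local server just by changing the base URL.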

Many of these models will run on a laptop or even a Raspberry Pi, too.
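If your hardware is modest, smaller models are the way to go; for instance, `ollama pull llama3.2:1b` followed by `ollama run llama3.2:1b` fetches and runs a roughly 1B-parameter model (again, just one example of a small model that Ollama hosts).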