Nice. Would regular users put the API endpoint address into settings?

Discussion

Yeah, the UX would just be selecting Ollama as the provider, since it uses a standard endpoint.
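
For anyone curious why a plain "Ollama" option works without asking for a URL: the local server listens on a well-known default address. A minimal sketch of talking to that endpoint from Python (this assumes a local Ollama server on the default port and an already-pulled model; the model name is just an example):

import requests

# Ollama's default local endpoint; clients can assume this
# unless the user has changed OLLAMA_HOST.
OLLAMA_URL = "http://localhost:11434"

def chat(model: str, prompt: str) -> str:
    """Send a single-turn chat request to a local Ollama server."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    # Example model name; anything pulled with `ollama pull` works.
    print(chat("deepseek-r1:14b", "Say hello in one sentence."))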

What model are you running and what’s the performance like on the Mac?

I’m using deepseek-r1:14B at the moment, but I need to try more out. I’m currently running it concurrently against GPT-4o (which queries the OpenAI API) in Open WebUI to compare output, which is a nice feature. I think I need a bigger GPU!
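
Side note for anyone who wants to script that kind of comparison instead of using the Open WebUI side-by-side view: Ollama also exposes an OpenAI-compatible endpoint, so the same client code can query both the local model and GPT-4o. A rough sketch (the local URL, model names, and prompt are assumptions based on the defaults mentioned above):

import os
from openai import OpenAI

# Two clients with the same interface: one pointed at the local
# Ollama server's OpenAI-compatible endpoint, one at OpenAI itself.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # key is unused locally but required
remote = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

PROMPT = "Explain mixture-of-experts in two sentences."

def ask(client: OpenAI, model: str) -> str:
    out = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    return out.choices[0].message.content

# Compare the local reasoning model against GPT-4o on the same prompt.
print("deepseek-r1:14b:\n", ask(local, "deepseek-r1:14b"))
print("gpt-4o:\n", ask(remote, "gpt-4o"))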

Also been building a custom model (prompt) for Nini.
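
In case it helps others: in Ollama, a "custom model" like that is usually just a Modelfile layering a system prompt (and optionally parameters) over a base model. A minimal sketch, assuming deepseek-r1:14b as the base and an invented persona prompt for Nini, written and registered from Python:

import subprocess
from pathlib import Path

# Hypothetical persona prompt for "Nini"; base model and parameters are assumptions.
MODELFILE = '''\
FROM deepseek-r1:14b
SYSTEM """You are Nini, a friendly assistant. Keep answers short and concrete."""
PARAMETER temperature 0.7
'''

Path("Modelfile").write_text(MODELFILE)

# Register the custom model with the local Ollama server; after this it can be
# selected like any other model (e.g. in Open WebUI) under the name "nini".
subprocess.run(["ollama", "create", "nini", "-f", "Modelfile"], check=True)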