Anybody able to get the new Llama models working on Start9 with FreeGPT2? I downloaded it, but I'm getting this error:

Ollama: 500, message='Internal Server Error', url=URL('http://localhost:11434/api/chat')


Discussion

Is freegpt2 a fork of ollama?

Otherwise, I wonder whether Ollama is even running; it would need to be reachable for you to get that error back at all.
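
One quick way to check: Ollama answers GET /api/version when it's up. A minimal sketch using only the Python standard library; localhost:11434 is the default port from your error message, but whether it's reachable from where you run this (host vs. inside the Start9 container) is an assumption:

    import json
    import urllib.request

    # Hit Ollama's version endpoint; any JSON reply means the server is up.
    # 11434 is Ollama's default port -- adjust host/port if your setup differs.
    try:
        with urllib.request.urlopen("http://localhost:11434/api/version", timeout=5) as resp:
            print("Ollama is up, version:", json.load(resp).get("version"))
    except OSError as exc:
        print("Could not reach Ollama:", exc)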

I'm not sure. I think it pulls models from Ollama and integrates them into the chat functionality.
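
If that's how it works, it's worth confirming the Llama model actually landed in Ollama's local list. GET /api/tags returns the models Ollama knows about; same host/port assumptions as above:

    import json
    import urllib.request

    # List the models Ollama has locally; if the Llama model isn't here,
    # the chat call will fail no matter what the UI shows.
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        for model in json.load(resp).get("models", []):
            print(model.get("name"), model.get("size"))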

It looks like FreeGPT2 bundles Ollama and Open WebUI. A quick peek at the Ollama issue tracker and I found one dealing with version problems and Docker:

https://github.com/ollama/ollama/issues/5892
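
If it is a version mismatch like in that issue, calling /api/chat directly (outside Open WebUI) usually surfaces the real error message in the response body instead of a bare 500. A rough sketch; the model name "llama3.1" is just a placeholder, swap in whatever /api/tags actually lists:

    import json
    import urllib.error
    import urllib.request

    # Call Ollama's chat endpoint directly; the JSON error body usually says
    # why it failed (e.g. the model requires a newer Ollama version).
    payload = {
        "model": "llama3.1",  # assumption -- use the exact name from /api/tags
        "messages": [{"role": "user", "content": "hello"}],
        "stream": False,
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            print(json.load(resp)["message"]["content"])
    except urllib.error.HTTPError as err:
        print("Ollama returned", err.code, err.read().decode())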