Howdy! Are you using something like Ollama? You should be able to select 'OpenAI compatible' as an option in 'stacks configure' and point it at an OpenAI-like endpoint. In Ollama's case, that's http://localhost:11434/v1/
More information on this can be found here:
https://github.com/ollama/ollama/blob/main/docs/openai.md
Let me know if you have any questions!
It will be really funny if I've been messing with this for more than a month only to find it needed /v1/ 😂😂
Thank you. 🙏