eBay server graphics card for self-hosted AI (ollama): an 8GB Tesla P4 from 2016, $120. It takes one PCIe x16 slot and is small enough to fit in a 1U server. Should be enough to get started.
ollama is really sick. It looks like ChatGPT's UI.
That is sick
That's with Open WebUI, right?
Looks like it.
It also has a simple API to develop against: https://github.com/ollama/ollama/blob/main/docs/api.md
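For example, a minimal Python client for the `/api/generate` endpoint might look like this, a sketch assuming the default local port 11434 and a placeholder model name ("llama3.2" stands in for whatever model you've pulled):

```python
import json
import urllib.request

# ollama listens on port 11434 by default
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> dict:
    # Non-streaming request body for ollama's /api/generate endpoint
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    # POST the JSON body and return the generated text from the "response" field
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (with an ollama server running and the model pulled):
#   generate("llama3.2", "Why is the sky blue?")
```

With `stream` left at its default of true the server sends one JSON object per token instead of a single response, which is what Open WebUI uses for the typing effect.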