Never seen that before.
But I mean running a literal AI server out of my house. Because I'm an insane person who likes spending a buttload of money and time in pursuit of digital sovereignty.
Ollama is good, and there are plenty of solid models available for it.
Dead easy to install and use.
Only downside: you'll need a decent Nvidia card.
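If you want to poke at it programmatically, here's a minimal Python sketch against Ollama's local HTTP API. It assumes the default port 11434 and that you've already pulled a model (llama3 here as an example):

```python
# Rough sketch: query a local Ollama instance over its HTTP API.
# Assumes Ollama is running on the default port (11434) and that
# you've already pulled a model, e.g. with `ollama pull llama3`.
import json
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why run an AI server at home?"},
    stream=True,
    timeout=300,
)
resp.raise_for_status()

# Ollama streams one JSON object per line until "done" is true.
for line in resp.iter_lines():
    if line:
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
```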
Yeah, I've had an idea to build a dedicated system with 3 or 4 4090s for things like Stable Diffusion, but I haven't been able to justify the 💰 to myself yet.
I tried 3 different Ollama Docker containers and none of them could download the models... I'll have to find some time to try again.
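One thing that might be worth trying: if it's the pull inside the container that's failing, you can ask the server to pull over its HTTP API from the host instead. A rough sketch, assuming the container publishes the default port (e.g. `docker run -d -p 11434:11434 ollama/ollama`) and using llama3 as a stand-in model name:

```python
# Hedged sketch: trigger a model download via Ollama's pull endpoint
# from the host, instead of running the CLI inside the container.
# Assumes the container maps port 11434; the model name is an example.
import json
import requests

resp = requests.post(
    "http://localhost:11434/api/pull",
    json={"name": "llama3"},
    stream=True,
    timeout=None,  # model downloads can take a long time
)
resp.raise_for_status()

# The pull endpoint streams JSON status lines as layers download.
for line in resp.iter_lines():
    if line:
        print(json.loads(line).get("status", ""))
```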