Ollama is good. There are many good models available for it.

Dead easy to install and use.

Only downside: you'll need a good Nvidia card.
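For instance, once the server is running, you can talk to it over its local REST API (a minimal sketch using Ollama's `/api/generate` endpoint on the default port 11434; `llama3` is just an example model name, substitute whatever you've pulled):

```python
import requests

# Ask the local Ollama server for a single (non-streamed) completion.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```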


Discussion

Yeah, I've had an idea to build a dedicated system with 3 or 4 4090s for things like Stable Diffusion, but I haven't been able to justify the 💰 to myself yet.

I tried 3 different Ollama Docker containers and none of them could download the models... I'll have to find some time to try again.
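If it helps debugging: you can trigger the download through the API itself to see whether the container can reach the model registry at all (a sketch assuming the container exposes the default port 11434; `llama3` is just an example model name):

```python
import json
import requests

# /api/pull streams JSON progress objects, one per line.
with requests.post(
    "http://localhost:11434/api/pull",
    json={"name": "llama3"},
    stream=True,
    timeout=None,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if line:
            # Print download/verify status updates as they arrive.
            print(json.loads(line).get("status", ""))
```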