Can we download and run our own AI now?
If you have 160 GB of VRAM, yes!
That sounds doable, doesn't it?
Certainly for some folks. You definitely need some solid hardware, but running locally is super easy with Ollama and OpenWebUI.
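For anyone curious, a typical local setup with those tools might look something like this (a sketch based on the standard Ollama and OpenWebUI install docs; the model name `llama3` is just an example, and smaller models need far less than 160 GB of VRAM):

```shell
# Install Ollama via its official installer script (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model locally (llama3 is just an example model)
ollama run llama3

# Optionally run OpenWebUI in Docker as a browser front end for Ollama
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

After that, OpenWebUI should be reachable at http://localhost:3000 and will talk to the local Ollama server.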