I’m convinced the pitiful amount of VRAM in consumer GPUs is a scam to push everyone onto the corporations’ cloud servers.

Discussion

You can just buy an on-prem server. With some dirty fiat.

I’d like to have a local computer capable of running the most useful AI software, but it seems you’re into $10k+ territory to get a GPU with more than 32 GB of VRAM.
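Rough math on why that 32 GB line matters, for anyone curious. This is just a back-of-envelope sketch; the model sizes, quantization levels, and the ~20% runtime overhead factor are my own assumptions, not hard numbers:

```python
# Back-of-envelope memory estimate for running an LLM locally.
# Weights dominate; the 1.2x factor is an assumed ~20% overhead
# for the KV cache, activations, and runtime buffers.

def est_memory_gb(params_billion: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for params, bits in [(8, 4), (32, 4), (70, 4), (70, 8)]:
    print(f"{params}B @ {bits}-bit: ~{est_memory_gb(params, bits):.0f} GB")

# 8B  @ 4-bit: ~5 GB   -> fits comfortably on a 24 GB consumer card
# 32B @ 4-bit: ~19 GB  -> tight but workable on 24 GB
# 70B @ 4-bit: ~42 GB  -> past the 32 GB wall, hence the $10k+ GPUs
# 70B @ 8-bit: ~84 GB  -> beyond even 64 GB of shared memory
```

So a 4-bit 70B model needs roughly 40+ GB, which is exactly where consumer cards stop and datacenter pricing starts.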

IMO the Mac minis with 64 GB of shared RAM are great for home use. Shared RAM isn’t quite as fast as dedicated VRAM, but it does the job perfectly well at home. Something around $2-2.5k.

Yeah, I’ve been considering getting something like that. AMD is also coming out with some shared-memory mini PCs, I think with 128 GB.