Replying to Guy Swann

Holy shit… it costs $10K, but you can get a Mac Studio (the little computer) with as much as 512GB of unified memory. That's right: that's RAM that can be used as vRAM. Meaning you can natively run the largest DeepSeek and Llama models, with tons of room to spare, on this single device.
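The claim checks out on a back-of-the-envelope basis. Here's a rough sketch of the arithmetic, assuming 4-bit quantization and roughly 20% overhead for KV cache and activations (both figures are assumptions, not from the post); parameter counts are the publicly reported ~671B for DeepSeek-V3/R1 and 405B for Llama 3.1:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Approximate weights footprint in GB, padded ~20% for KV cache/activations."""
    bytes_per_weight = bits_per_weight / 8
    return params_billion * bytes_per_weight * overhead  # 1e9 params * bytes / 1e9

# Rough footprints against the Mac Studio's 512GB of unified memory:
for name, params in [("DeepSeek ~671B", 671), ("Llama 3.1 405B", 405)]:
    for bits in (4, 8):
        gb = model_memory_gb(params, bits)
        fits = "fits" if gb < 512 else "does not fit"
        print(f"{name} @ {bits}-bit: ~{gb:.0f} GB ({fits} in 512GB)")
```

At 4-bit, even the ~671B DeepSeek model lands around 400GB, comfortably inside 512GB; at 8-bit it would not fit, which is why local runs of models this size typically rely on quantization.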

The acceleration of hardware toward AI optimization is going to be crazy. I get the sense we'll see double, triple, even quadruple the vRAM equivalent (though it'll all be unified memory) in just the next couple of years of product iterations.

Running huge LLMs and video/image models locally will get easier and easier.

24536e12... 9mo ago

I'm not paying 10k for RAM, buddy.


Discussion

Guy Swann 9mo ago

The RAM is $4,800; the whole thing costs $10K. And if you're looking for something that can be used as vRAM for AI, that's probably the lowest price you can get anywhere.
