Replying to Guy Swann

Holy shit… it costs $10K, but you can get a Mac Studio (the little computer) with as much as 512GB of unified memory. That's right: RAM that can be used as vRAM. Meaning you can natively run the largest DeepSeek and Llama models with tons of room to spare on this single device.
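A rough sanity check of that claim (a sketch, assuming DeepSeek's largest open model is ~671B parameters and using common quantization levels; KV cache and activations are ignored):

```python
def model_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate memory needed for model weights alone, in GB."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# ~671B parameters is the commonly cited size for DeepSeek-R1/V3.
for bits in (16, 8, 4):
    gb = model_memory_gb(671, bits)
    fits = "fits" if gb <= 512 else "does not fit"
    print(f"{bits}-bit: ~{gb:.0f} GB of weights -> {fits} in 512 GB")
```

At full 16-bit precision the weights alone blow past 512GB, but at the 4-bit quantization people typically run locally (~335GB) it fits with real headroom.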

The acceleration of hardware toward AI optimization is going to be crazy. I get the sense we will see double, triple, and quadruple the vRAM equivalent (though it'll all go unified) within just the next couple of years of product iterations.

Running huge LLMs and video/image models locally will get easier and easier.

Guy Swann 9mo ago

Btw every indication suggests this could run ChatGPT as well. I don't think we know exactly how much RAM is required, but considering comparable models, it seems like a very safe bet that you could run their best model natively on this machine.

Crazy

