I yolo bought a 64gb model, going to use it as my backup bitcoin node + large model inference tasks


Discussion

Brilliant🚀

M4 Pro is still a great deal if you want to swing the ~2M sats

Just realized you can get an m4 macbook with 128gb of unified ram. That would be wild for large model inference. A little more pricey but… hmm

70B LLMs would be easy to run...👀

🤔

Yeah, you get ~8 tokens a second though

True
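[Editor's note: the ~8 tok/s figure roughly checks out if you treat single-stream token generation as memory-bandwidth bound, i.e. every token requires reading the full weight set. A back-of-envelope sketch, assuming a 4-bit quant of a 70B model and an M4 Pro-class ~273 GB/s of memory bandwidth (assumed figures, not benchmarks):]

```python
# Rough estimate: tokens/s ≈ memory bandwidth / bytes read per token.
# For single-stream decoding, bytes per token ≈ model weight size
# (this ignores KV cache reads and other overhead, so it's an upper bound).

def tokens_per_second(params_billion: float, bits_per_weight: int, bandwidth_gb_s: float) -> float:
    model_gb = params_billion * bits_per_weight / 8  # weights only
    return bandwidth_gb_s / model_gb

# 70B model, 4-bit quantized (~35 GB), assumed ~273 GB/s bandwidth
print(round(tokens_per_second(70, 4, 273), 1))  # → 7.8
```

Same math says a Max-tier chip with roughly double the bandwidth would land near ~15 tok/s on the same quant.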

Need to max out the cores. Idk if a 14-core CPU is enough

A single core CPU is enough tbh

Surely there's more you could be doing to increase the attack surface