My take is that, unless you want to develop your own LLMs, I would wait for the consumer architecture to stabilize instead of being an early adopter and overspending on a local rig.


Discussion

Training your own LLMs on your own specific dataset is the only use case I would suggest building a local rig for.

I'd disagree; a lot of use cases are completely viable locally.

Yes, I do a lot locally too.

It's hard to have too much sovereign compute, though.