My take: unless you want to develop your own LLMs, I'd wait for the consumer architecture to stabilize rather than be an early adopter and overspend on a local rig.
Training your own LLMs on a specific dataset is the only use case I'd suggest building a local rig for.
I'd disagree; a lot of use cases are completely viable locally.
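For instance, plain inference is already easy on consumer hardware. Here's a minimal sketch using the Ollama HTTP API, assuming Ollama is running on its default port and you've already pulled a model (the "llama3" name below is just an example):

```python
# Minimal local-inference sketch via the Ollama HTTP API.
# Assumes Ollama is running locally (default port 11434) and a model
# such as "llama3" has been pulled beforehand with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # example model name; use whatever you've pulled
        "prompt": "Summarize the tradeoffs of running LLMs locally.",
        "stream": False,    # one complete response instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text
```

Nothing about this needs a training-class rig; a single consumer GPU (or even CPU-only, for smaller quantized models) handles it fine.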
Yes, I do a lot locally too.