That figure is for training them. For inference, you need more like a few thousand dollars of graphics cards to max out LLM performance for a single user on today's existing models. And we're starting to get pretty useful stuff in the 4-32GB range (typical consumer devices).
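Rough back-of-the-envelope for why that 4-32GB range matters (my own sketch, not from the thread; the ~20% overhead figure and model sizes are illustrative assumptions): weight memory is roughly parameter count times bytes per parameter, plus headroom for the KV cache and activations.

```python
def est_memory_gb(params_billions: float, bits_per_param: float,
                  overhead: float = 0.2) -> float:
    """Estimate inference memory: weights plus ~20% for KV cache etc.

    1B params at 8 bits is roughly 1 GB of weights.
    """
    weights_gb = params_billions * (bits_per_param / 8)
    return weights_gb * (1 + overhead)

# Illustrative sizes: a 7B model 4-bit quantized fits in ~4-5 GB,
# while the same model at 16-bit needs ~17 GB.
for params, bits in [(7, 4), (13, 4), (70, 4), (7, 16)]:
    print(f"{params}B @ {bits}-bit: ~{est_memory_gb(params, bits):.1f} GB")
```

By this arithmetic, 4-bit quantized models in the 7-13B class land squarely in the consumer 4-32GB window, while 70B-class models push past it.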


Discussion

Meant 4-32GB of memory; forgot to specify.

Yeah that's more what I assumed it would be like by now.