Amazon Web Services CEO Matt Garman estimates that training a large language model (LLM) two to three generations from now will require as much power as a large city, with an individual model drawing somewhere between 1 and 5 GW.
[ https://www.tomshardware.com/tech-industry/artificial-intelligence/aws-ceo-estimates-large-city-scale-power-consumption-of-future-ai-model-training-tasks-an-individual-model-may-require-somewhere-between-one-to-5gw-of-power ]
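For scale, here is a rough back-of-envelope sketch in Python. It assumes the article's 1 to 5 GW per-model figure, an average US household draw of roughly 1.2 kW, and a hypothetical 90-day training run; all three are ballpark assumptions, not figures from the article.

```python
# Back-of-envelope scale check for the "large city" comparison.
# Assumptions (not from the article): ~1.2 kW average continuous draw
# per US household (~10,500 kWh/year) and a 90-day training run.

GW = 1e9                       # watts per gigawatt
avg_household_draw_w = 1.2e3   # assumed average US household draw, in watts

for model_power_gw in (1, 5):
    # How many average households the same continuous draw would supply
    households = model_power_gw * GW / avg_household_draw_w
    # Energy consumed if training holds that draw for 90 days
    energy_twh = model_power_gw * GW * 90 * 24 / 1e12
    print(f"{model_power_gw} GW ~ {households / 1e6:.1f}M households; "
          f"a 90-day run at that draw ~ {energy_twh:.1f} TWh")
```

Under those assumptions, 1 GW corresponds to roughly 0.8 million average households and about 2 TWh over the run, while 5 GW is roughly 4 million households and about 11 TWh, which is consistent with the "large city" framing.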