I love the idea of running AI models locally, but scale seems like a huge issue: the more parameters a model has, the more hardware you have to throw at it. At home, you're obviously not going to have more than one GPU, and certainly not the hundreds of thousands of GPUs the big players have in their datacenters.

Discussion

You don't need 100 GPUs for inference; a single regular RTX 30xx or above is enough for most models you'd run at home.
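
For context, here is a minimal sketch of what single-GPU local inference can look like, assuming the Hugging Face transformers and bitsandbytes libraries are installed and a ~7B model is used (mistralai/Mistral-7B-Instruct-v0.2 here purely as an example, any similar-size model works):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Example model choice; swap in any ~7B model you have access to.
model_id = "mistralai/Mistral-7B-Instruct-v0.2"

# 4-bit quantization shrinks a ~7B model to roughly 4-5 GB of weights,
# comfortably within the 8-12 GB of VRAM on an RTX 30xx card.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # places the whole model on the single available GPU
)

# Generate a short completion to verify everything fits and runs.
inputs = tokenizer("Why is one GPU enough for inference?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The key difference from training is that inference only needs the quantized weights and a small activation buffer in memory, not optimizer states or gradients, which is why a single consumer card handles it fine.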