It's great that open-source AI models are widely considered important, but what about affordable hardware that can run inference on them, now that the bleeding edge is reaching trillions of parameters? I'd like to see more interest and effort there.
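To give a sense of the scale involved, here is a rough back-of-the-envelope sketch of the memory needed just to hold the weights of a trillion-parameter model. The parameter count and quantization levels are illustrative assumptions, not figures from any particular model, and the estimate ignores KV cache and activation memory:

```python
def weight_memory_gb(num_params: float, bits_per_param: float) -> float:
    """GB needed to store the weights alone (no KV cache, no activations)."""
    return num_params * bits_per_param / 8 / 1e9

# Hypothetical 1-trillion-parameter model at common quantization levels.
for bits in (16, 8, 4):
    gb = weight_memory_gb(1e12, bits)
    print(f"1T params @ {bits}-bit: {gb:,.0f} GB")
```

Even aggressively quantized to 4 bits, that is on the order of 500 GB of weights, which is why consumer GPUs with tens of gigabytes of VRAM fall far short of frontier-scale inference.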