It's great that open-source AI models are widely considered important, but what about affordable hardware that can run inference on them, now that the bleeding edge is reaching into the trillions of parameters? I'd like to see more interest and effort there.
