Thinking of building a server to run a local AI model. I'm not that interested in running something like Stable Diffusion. Do I still need a bunch of GPUs?
#asknostr
I just want to run a good LLM on my phone. I’ve tried a few options but we’re still early.