Replying to S!ayer

what's the best LLM to run from your local machine without the need for a heavy GPU?

Llama et al. all seem to require a chunky GPU, but surely we're at the stage (3 years later) where we have some local LLMs?

atyh 1y ago

TinyBERT
