Replying to S!ayer

what's the best LLM to run from your local machine without the need for a heavy GPU?

Llama et al. all seem to require a chunky GPU, but surely we're at the stage (3 years later) where we have some CPU-friendly local LLMs?
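For context, one common CPU-only route (a sketch, not a suggestion from this thread; the specific model file name is a placeholder) is llama.cpp with a small quantized GGUF model:

```shell
# Build llama.cpp (CPU-only by default, no GPU toolkit needed)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release

# Run a small quantized model entirely on CPU.
# model.Q4_K_M.gguf is a placeholder: any ~4-bit quantized 7B-or-smaller
# GGUF file works on a typical laptop without a dedicated GPU.
./build/bin/llama-cli -m ./models/model.Q4_K_M.gguf -p "Hello" -n 64
```

Wrappers like Ollama package the same idea behind a single command (e.g. `ollama run <model>`), trading some control for convenience.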

S!ayer 1y ago

This has been informative. Thanks all for the suggestions. Time to deep dive.

nostr:nevent1qvzqqqqqqypzpn0vcvwxm9qxaxnadvqxwsf25esan5cusq6u8lt9cp3sr5w2cwujqyfhwumn8ghj7mmxve3ksctfdch8qatz9uq3wamnwvaz7tmjv4kxz7fwdehhxarj9e3xzmny9uqzqwkwg4ge8qa5cq4xjp64mnj6x57fh2xyd00kqfj9xp8q53x0ulrxcm9vkn
