Replying to SUPERMAX

I prefer to run LLMs locally. The best way to get started is to run one of the Llama or Mistral 7B/8B models, sized to whatever RAM you have (a quantized 8B-parameter model can run on a device with ~8GB of RAM).
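As a rough sketch of what that looks like in practice, here's a minimal example using the llama-cpp-python bindings. The model filename and path are placeholders, swap in whichever quantized GGUF model you actually download:

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path below is a placeholder; point it at any quantized GGUF file you have,
# e.g. a 4-bit Mistral 7B or Llama 3 8B download.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,      # context window; keep modest on low-RAM machines
    n_threads=8,     # roughly match your CPU core count
)

out = llm(
    "Q: What's a good way to get started with local LLMs?\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```

Tools like Ollama or LM Studio wrap this same kind of setup in a friendlier interface if you'd rather not touch code.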

If you want something easier, the free tier of Claude is probably the best option rn imo, but the landscape changes fast, literally day by day.

cornman 1y ago

appreciate it!
