Replying to jb55

on mac?

yes

Discussion

Just had a quick look into this and it seems possible to do this for free, i.e. to run open-source models like Llama 2, Mistral, or Phi-2 locally on the Mac using Ollama.

No internet, no API keys, no limits, and Apple Silicon runs them well.
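
For anyone curious, here is a minimal sketch of talking to such a local model from Python via Ollama's REST API. It assumes Ollama is installed, a model has already been pulled (ollama pull mistral), and the server is listening on its default port 11434; the model name and prompt are just placeholders.

# minimal sketch: query a locally running Ollama server
# assumes: ollama is installed, `ollama pull mistral` has been run,
# and the server is on its default port 11434
import requests

def ask_local_model(prompt, model="mistral"):
    # POST to Ollama's local generate endpoint; stream=False returns one JSON object
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize what Nostr is in two sentences."))

Everything stays on the machine: the request never leaves localhost.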

you can even use dave with a setup like this: a fully private local ai assistant that can find and summarize notes for you

nostr:note17c3zxygr2krkt90lyrvh5rxtfmnstkcpkjyxmkz5z3tleagc848qlfyu9m

Cool. I’m still learning. So much to play with!

Would 48 GB be sufficient?
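
A rough back-of-envelope sketch for that, assuming 4-bit quantized weights and a ~1.2x overhead factor for the KV cache and runtime (both figures are assumptions, not measurements):

# rough memory estimate for a quantized model
# assumptions: 4-bit weights, ~1.2x overhead for KV cache and runtime
def estimate_gb(params_billion, bits_per_weight=4, overhead=1.2):
    weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb * overhead

for size in (7, 13, 70):
    print(f"{size}B model: ~{estimate_gb(size):.1f} GB")
# 7B ~ 4.2 GB, 13B ~ 7.8 GB, 70B ~ 42 GB
# so 48 GB of unified memory comfortably fits 7B/13B models,
# and a 4-bit 70B may just squeeze in, though it will be tight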