Replying to jack

yes

Just had a quick look into this and it seems possible to do it for free, i.e. to run open-source models like Llama 2, Mistral, or Phi-2 locally on a Mac using Ollama.

No internet, no API keys, no limits, and Apple Silicon runs them well.
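A minimal sketch of that workflow, assuming Ollama is installed and a model has already been pulled (ollama pull mistral). Ollama serves a local REST API on port 11434, so the Python standard library is enough to talk to it:

    # Query a locally running Ollama server; no internet or API key needed.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "mistral",   # any model tag you have pulled locally
        "prompt": "Explain Nostr in one sentence.",
        "stream": False,      # ask for a single JSON object, not a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])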


Discussion

You can even use Dave with a setup like this: a fully private, local AI assistant that can find and summarize notes for you.

nostr:note17c3zxygr2krkt90lyrvh5rxtfmnstkcpkjyxmkz5z3tleagc848qlfyu9m
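For the find-and-summarize idea, here is a hedged sketch along the same lines. The helper and the notes list are placeholders (a real setup would pull note text from a Nostr client), and this is not Dave's actual implementation:

    # Summarize note text with the same local Ollama endpoint.
    import json
    import urllib.request

    def ask_local_model(prompt, model="mistral"):
        # Hypothetical helper wrapping the /api/generate call shown above.
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload.encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    # Stand-in note text; a real client would fetch this from relays.
    notes = ["Tried running Mistral locally with Ollama today, works great on M1."]
    for note in notes:
        print(ask_local_model("Summarize this note in one line:\n" + note))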

Cool. I’m still learning. So much to play with!