nostr:npub1lj3lrprmmkjm98nrrm0m3nsfjxhk3qzpq5wlkmfm7gmgszhazeuq3asxjy I’m running Llama 2 and Code Llama locally on my laptop. Lots of fun. Only the 7B models so far, I think. Wondering if I could run the 13B ones with 24 GB of RAM.
Really want to be able to feed it docs, PDFs, etc. Currently only running it on the command line via Ollama.
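For what it’s worth, the Ollama README suggests about 16 GB of RAM for the 13B models, so 24 GB should be enough. And for feeding it PDFs, here’s a minimal sketch of one way to do it from Python instead of the command line, assuming the ollama client and pypdf packages are installed (pip install ollama pypdf) and the Ollama server is running; the file name is just a placeholder:

```python
# Minimal sketch: ask a local Ollama model about a PDF.
# Assumes `pip install ollama pypdf`, a running Ollama server,
# and a model already pulled (e.g. `ollama pull llama2`).
import ollama
from pypdf import PdfReader

# Placeholder file name -- swap in your own document.
reader = PdfReader("some_doc.pdf")
text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Naive approach: stuff the whole document into the prompt.
# Long PDFs will overflow the context window, so you'd want to
# truncate or chunk the text for anything sizable.
response = ollama.chat(
    model="llama2",  # or "llama2:13b" if it fits in RAM
    messages=[
        {"role": "system", "content": "Answer using only the provided document."},
        {"role": "user", "content": f"Document:\n{text}\n\nQuestion: What is this document about?"},
    ],
)
print(response["message"]["content"])
```

Stuffing the full text into the prompt only goes so far; past that, people usually chunk the PDF and retrieve relevant pieces per question, but the sketch above is enough to get started.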