Have you tried running smaller models with Ollama for specific tasks?
Yeah, I’ve got Llama 3 running via Ollama. Downloading the bigger version now. Should be able to run it once my new RAM arrives.
Mostly, though, I’m thinking ahead. I plan to download and try out a whole bunch of different models, all of which are extraordinarily large files. This seems like a perfect job for BitTorrent.