what does ollama give me?


Discussion

It's an alternative to llama.cpp, the server that actually runs the models; it aims to be more user-friendly and "plug and play" compared to using llama.cpp directly.

From what I understand, it just makes it easier to swap models, but it still uses llama.cpp under the hood.

Sorry, yeah, it's more like a wrapper than an alternative.

Yes, but don't discount that value. I used to run llama.cpp on the command line because it's amazing that you can ./a-brain. These days I run ollama on the GPU computer and connect to it from my laptop or phone using Zed or Enchanted.
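Connecting from another machine works because ollama exposes an HTTP API (on port 11434 by default) that clients like Zed or Enchanted talk to. A minimal sketch of that API, assuming the server's hostname is `gpu-box` and a model named `llama3` has already been pulled:

```shell
# On the GPU machine: serve on all interfaces, not just localhost
OLLAMA_HOST=0.0.0.0 ollama serve

# From the laptop: ask the server for a completion over HTTP
# ("gpu-box" and "llama3" are placeholders for your host and model)
curl http://gpu-box:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'
```

Any client that speaks this API can use the remote machine's GPU; the laptop or phone never runs the model itself.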

For me, the best thing ollama gave me was the ability to easily pull different models from their library with `ollama pull`. Way easier than downloading them manually from somewhere and placing them in the right location.
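The pull workflow looks roughly like this (model names are examples from the ollama library; pick whatever fits your hardware):

```shell
# Download a model from the ollama library; ollama handles
# storage location and format for you
ollama pull llama3

# See which models are installed locally
ollama list

# Chat with a model interactively (pulls it first if missing)
ollama run llama3

# Remove a model you no longer need
ollama rm llama3
```

Swapping models is just another `ollama pull`, which is the "plug and play" part mentioned above.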

yeah I can see that