Is Ollama CLI-only? Don't they have a desktop UI?
Correct. Very easy to use though.
Details at https://ollama.com/blog/run-llama2-uncensored-locally, but tl;dr: `ollama run llama2-uncensored`
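And if you'd rather script it than type into the CLI, Ollama also serves a local REST API (default port 11434). A minimal Python sketch, assuming `ollama serve` is running and the model has already been pulled:

```python
# Minimal sketch: calling the local Ollama server from Python instead of the CLI.
# Assumes the default port (11434) and that you've already run
# `ollama pull llama2-uncensored`.
import json
import urllib.request

def generate(prompt: str, model: str = "llama2-uncensored") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one complete JSON reply instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(generate("Why is the sky blue?"))
```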
You can use https://ollama-gui.vercel.app/
Cool!
https://lmstudio.ai/ is a good alternative with a UI for running different models locally.
Python CLI
At least there’s a nice TUI: https://github.com/ggozad/oterm
I think they have a UI plugin, and you can run a Gradio interface. A rough sketch of what that could look like is below.
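Something like this, wiring Gradio's `Interface` to Ollama's local `/api/generate` endpoint. Treat this as a sketch: the model name and defaults are assumptions, and it expects Ollama running on its default port plus `pip install gradio`:

```python
# Minimal sketch of a Gradio front-end over Ollama's local REST API.
import json
import urllib.request

import gradio as gr

def ask(prompt: str) -> str:
    # Same /api/generate call as the CLI makes; model name is an assumption.
    payload = json.dumps({
        "model": "llama2-uncensored",
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Launches a simple text-in/text-out web UI on localhost.
gr.Interface(fn=ask, inputs="text", outputs="text", title="Ollama chat").launch()
```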