is Ollama CLI only? they don't have a desktop UI?


Discussion

Correct. Very easy to use though.

See https://ollama.com/blog/run-llama2-uncensored-locally for details, but tl;dr: `ollama run llama2-uncensored`
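For context, a minimal session might look like this (a sketch, assuming Ollama is installed and using the model name from the blog post above):

```shell
# Pull and run the model interactively (downloads it on first use).
ollama run llama2-uncensored

# Ollama also serves a local REST API (default port 11434),
# so other tools and UIs can talk to it:
curl http://localhost:11434/api/generate -d '{
  "model": "llama2-uncensored",
  "prompt": "Why is the sky blue?"
}'
```

That REST API is what most of the third-party UIs build on.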

Cool!

https://lmstudio.ai/ is a good alternative with UI to use different models locally.

Python CLI

I couldn’t install it on Windows, and the Mac version doesn’t seem to have a GUI.

At least there’s a nice TUI: https://github.com/ggozad/oterm

I think they have a UI plugin, and you can run a Gradio interface.
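Something like this could serve as a quick Gradio front end (a sketch, assuming the `ollama` Python package and `gradio` are installed and a local Ollama server is running; the model name is just the one mentioned earlier in the thread):

```python
import gradio as gr
import ollama  # assumes `pip install ollama gradio` and a running Ollama server

def chat(message, history):
    # Convert Gradio's (user, assistant) history pairs into the
    # role/content message list that ollama.chat expects.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    reply = ollama.chat(model="llama2-uncensored", messages=messages)
    return reply["message"]["content"]

# Launches a local chat UI in the browser.
gr.ChatInterface(chat).launch()
```

Not an official UI, but it gets you a browser chat window in a dozen lines.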

ty!

Why do they need an email address? WTF

They like to send out a mailer!