I think when you run ollama and oterm locally, the models run entirely on your machine, so you get privacy.

At least when I tested it, it still worked with the wifi turned off.
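For what it's worth, here's a minimal sketch of the kind of check I mean: ollama serves its API on localhost (port 11434 by default), so a prompt sent there never needs the network. This assumes ollama is installed with a model (I use "llama3" as a placeholder name) already pulled; if the server isn't running, the sketch just returns None instead of crashing.

```python
import json
import urllib.request

# ollama's default local API endpoint -- requests go to your own machine,
# not to a remote service, which is why it works with wifi off.
OLLAMA_URL = "http://localhost:11434/api/generate"

def query_local(prompt, model="llama3"):
    """Send a prompt to a locally running ollama server; None if unreachable."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.loads(resp.read())["response"]
    except OSError:
        # Connection refused / timed out: no local server is listening.
        return None
```

Because everything targets localhost, you can run this with networking disabled and it still answers (as long as the ollama server is up).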

Of course, it's not impossible that it's storing my prompts and sending them somewhere later, once I'm back online.

But it's open source, so I'd guess someone would find out.


Discussion

Yes, I think running locally is a great option, especially as home computing power grows.