And I think when you run ollama and oterm locally, you can use the models with full privacy.
At least I tested it, and it still worked with wifi turned off.
Of course, it's possible that it stores my prompts and sends them somewhere later once I'm back online.
But it's open source, so I guess someone would find out.
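If you want a quick sanity check yourself, here's a small sketch (stdlib only) that probes whether a local ollama daemon is answering on its default port, 11434. It doesn't prove nothing is phoned home, just that the endpoint you're talking to is on localhost; combine it with wifi off, as above, for a stronger test.

```python
import urllib.request
import urllib.error

# Default local ollama endpoint; oterm talks to this same address.
OLLAMA_URL = "http://localhost:11434"

def ollama_is_local(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if a local ollama daemon answers on the default port."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # A running daemon responds 200 with "Ollama is running".
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: no local daemon listening.
        return False

if __name__ == "__main__":
    print("local ollama reachable:", ollama_is_local())
```

Either way it answers cleanly: True if the daemon is up, False if nothing is listening.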
Yes, I think running locally is a great option, especially as home computing power grows.