Replying to bitcoiner7 nym

And I think that when you run ollama and oterm locally, you can use the models and have privacy.

At least when I tested it, it still worked with wifi turned off.

Of course it's not impossible that it's storing my prompts and will send them somewhere later, once I'm back online.

But it's open source, so I guess someone would find out.
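For the extra-paranoid, the wifi-off test can even be done in-process: block every socket connection that isn't to the loopback interface, then talk to the model as usual. If inference still works, it's genuinely local. This is just a sketch of the idea, not anything ollama itself provides; the loopback allow-list and the test address are my own choices (ollama's local API listens on port 11434 by default, so loopback traffic to it would still be allowed):

```python
import socket

_real_connect = socket.socket.connect

def _local_only_connect(self, address):
    """Refuse any socket connection that is not to the loopback interface."""
    host = address[0] if isinstance(address, tuple) else address
    if host not in ("127.0.0.1", "localhost", "::1"):
        raise ConnectionError(f"blocked outbound connection to {host!r}")
    return _real_connect(self, address)

socket.socket.connect = _local_only_connect

# Talking to a local ollama server (127.0.0.1:11434) would still be allowed,
# while anything trying to leave the machine now fails loudly:
try:
    # 203.0.113.1 is a reserved documentation address, so no DNS lookup happens
    socket.create_connection(("203.0.113.1", 443), timeout=1)
except ConnectionError as e:
    print(e)  # blocked outbound connection to '203.0.113.1'
```

It's cruder than watching actual network traffic, but it makes any "send my prompts home later" attempt fail with an error instead of silently succeeding.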

Byzantine 1y ago

yes, i think running locally is a great option, especially as home computing power grows
