This only happens if you use it on some cloud server. How would this happen if I run it on my mats/ollama instance?
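For what it's worth, a local Ollama setup never has to leave localhost. Here's a minimal sketch of querying it over its HTTP API, assuming the default port 11434 and that you've already pulled a model tagged something like "deepseek-r1" (the model tag is just an example):

```python
# Minimal sketch: query a local Ollama instance over its HTTP API.
# Assumes Ollama is running on the default port 11434 and a model
# tagged "deepseek-r1" has been pulled (the tag is an assumption).
import json
import urllib.request

def ask_local(prompt: str, model: str = "deepseek-r1") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_local("Hello, are you running locally?"))
```

Everything in that exchange stays on your machine; no third-party server is involved.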

Discussion

It doesn't. It only collects data and censors the output if you use their native app. This whole drama is nothing but fear-mongering.

I'm running it with DeepInfra, and the replies I'm getting aren't censored; there's clearly no data collection either.

The censoring isn't even built in; that's why you see it give a response for a few seconds before it gets cut off, probably by a secondary model whose job is to censor the output.
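That "answer first, censor after" pattern would look roughly like the sketch below. To be clear, this is purely illustrative; generate_stream and looks_sensitive are made-up placeholders, not anything from their actual app:

```python
# Hypothetical sketch of after-the-fact moderation: the main model streams
# tokens to the user immediately, while a separate check runs on the
# accumulated text and withdraws the reply once it trips.
# generate_stream() and looks_sensitive() are placeholder callables here.
def relay(prompt, generate_stream, looks_sensitive):
    shown = []
    for token in generate_stream(prompt):    # tokens reach the user right away
        shown.append(token)
        print(token, end="", flush=True)
        if looks_sensitive("".join(shown)):  # the secondary check lags behind,
            print("\n[reply withdrawn]")     # which is why you briefly see the answer
            return None
    return "".join(shown)
```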

Which is hilariously inept.

Agreed. It does reason well and seems to be pretty balanced in its reasoning. Cutoffs happen, but they don't seem to change the general meaning of the reply.

It can't, as long as your local instance isn't connecting to the Internet, or if you've reviewed their entire codebase and ruled out data leakage.
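If you want to sanity-check that, one rough way (assuming Linux or macOS, psutil installed, and a server process actually named "ollama" — all of these are assumptions) is to list any non-loopback remote addresses that process has open:

```python
# Rough check: print any remote, non-loopback addresses the local "ollama"
# process is talking to. Requires the third-party psutil package.
import psutil

for proc in psutil.process_iter(["name", "pid"]):
    try:
        if proc.info["name"] and "ollama" in proc.info["name"].lower():
            for conn in proc.connections(kind="inet"):
                # a non-loopback remote address would mean it's phoning out
                if conn.raddr and conn.raddr.ip not in ("127.0.0.1", "::1"):
                    print(f"pid {proc.info['pid']} -> {conn.raddr.ip}:{conn.raddr.port}")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue
```

An offline local instance should print nothing here.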

What codebase? The weights? Why would it matter if the UI is connected to the internet?

Yes. It's just a model. It can't magically do surveillance things that all other models can't do.

Of course it can't, you're right.