Facebook has been doing this for two decades.

nostr:nevent1qqstqpltfa0wgzxqqxeef6eufztrd5568p86u7drl3wz5h5gzd3vjkspz3mhxw309ucnydewxqhrqt338g6rsd3e9upzq5xeflpdskqvdq4swxj59793uvdzqzc9pzatjk3nhmcg2h0js8trqvzqqqqqqyncgze8

Discussion

this is the entire business model of https://monkeytype.com/

This only happens if you use it on some cloud server. How would this happen if I use it on my mats/ollama instance?
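
A minimal sketch of what "using it locally" looks like, assuming an Ollama server on its default port and a DeepSeek model already pulled (the model tag is illustrative):

```python
# Minimal sketch: query a locally hosted model through Ollama's REST API.
# Assumes `ollama serve` is running on the default port (11434) and that a
# DeepSeek model has been pulled, e.g. `ollama pull deepseek-r1:7b`.
# Nothing here leaves the machine unless you point the URL elsewhere.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # local endpoint only

payload = {
    "model": "deepseek-r1:7b",   # illustrative model tag
    "prompt": "Explain what data, if any, this request sends off-box.",
    "stream": False,             # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```
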

It doesn't. It only collects data and censors the output if you use their native app. This whole drama is nothing but fearmongering.

I'm running it with DeepInfra and the replies I'm getting are not censored, and as far as I can tell there is no data collection.
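
A sketch of running the same model through a third-party host instead of the vendor's own app, assuming DeepInfra's OpenAI-compatible endpoint and an illustrative model ID (set DEEPINFRA_API_KEY in your environment first):

```python
# Sketch: call a DeepSeek model via a third-party provider's OpenAI-compatible API.
# The base URL and model ID below are assumptions for illustration, not a spec.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepinfra.com/v1/openai",  # assumed provider endpoint
    api_key=os.environ["DEEPINFRA_API_KEY"],
)

reply = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",  # illustrative model ID
    messages=[{"role": "user", "content": "Summarize your own privacy model."}],
)
print(reply.choices[0].message.content)
```
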

The censoring isn't even built in; that's why you see it give a response for a few seconds and then get cut off, probably by a secondary model whose job is to censor the output.
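
A toy illustration of that hypothesis, with placeholder functions standing in for the base model and the censor (this is not anyone's real API, just the shape of "stream first, cut off later"):

```python
# Toy sketch: the base model streams freely; a separate moderation pass cuts
# the stream off once it flags the partial output. Both functions are stand-ins.
from typing import Iterator

def stream_tokens(prompt: str) -> Iterator[str]:
    """Stand-in for the base model's token stream."""
    for token in ("The ", "answer ", "is ", "sensitive ", "content..."):
        yield token

def flags_output(text: str) -> bool:
    """Stand-in for a secondary censor model scoring the partial reply."""
    return "sensitive" in text

def censored_stream(prompt: str) -> str:
    shown = ""
    for token in stream_tokens(prompt):
        shown += token
        if flags_output(shown):  # the censor only reacts after tokens are already shown
            return "Sorry, I can't help with that."
    return shown

print(censored_stream("example prompt"))
```
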

Which is hilariously inept.

Agreed. It does reason well and seems to be pretty balanced in its reasoning. Cutoffs happen, but they don't seem to change the general meaning of the reply.

It can't, as long as your local instance isn't connected to the Internet, or you've reviewed their entire codebase for data leakage.

What codebase? The weights? Why would it matter if the UI is connected to the internet?

Yes. It's just a model. It can't magically do surveillance things that other models can't.

Of course it can't, you're right.

yup. #Nostr

This feels a lot like (China + AI) = Money Printer