"Slack trains machine-learning models on user messages, files and other content without explicit permission. The training is opt-out, meaning your private data will be leeched by default."

Neat.

It seems like the AI race is all about building models as fast as possible, regardless of user privacy. Essentially "fuck users, we gotta beat the competition."

https://www.engadget.com/yuck-slack-has-been-scanning-your-messages-to-train-its-ai-models-181918245.html


Discussion

fuckers should just join huggingface community

"fuck users... silently... otherwise they may opt out which hurts the bottom line, we gotta beat the competition"

If I had hundreds of millions at risk, I might say the same thing. I hope not, but you can't blame animals for following incentives. Shit's just upside down. To extend this: it's always been "fuck users silently, otherwise they'll cause a fuss which will hurt the bottom line." It's not new; I just hope it opens more people's eyes to how much "fuck the user" exists and always has.

Also, I'm pretty sure something recently leaked about Discord doing something similar, without the ability to opt out.

Oh really? I mean I'm not surprised. Many companies are probably doing this.

run local models only if you like your privacy bro.
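For anyone wondering what that looks like in practice, here's a minimal sketch using the Hugging Face transformers library to run a small open model entirely on your own machine, so prompts never leave it. The model name and prompt below are just illustrative placeholders.

```python
# Minimal sketch: run a small open-weight model locally (nothing is sent to a third party).
# Assumes `pip install transformers torch`; the model name is only an example.
from transformers import pipeline

generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

prompt = "Summarize why opt-out data collection is controversial."
result = generator(prompt, max_new_tokens=100, do_sample=False)
print(result[0]["generated_text"])
```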

Interesting that the opt-out is at the workspace level, not the individual user level.

Yeah, that's the really shitty part.