It’s ironic that Apple always talks about privacy being important but then uses user data to train its models by default. You have to opt OUT.
Don't trust, verify 👀
Meta is the worst with this. They use all your Messenger conversations unless you opt out PER conversation, which isn't even possible for group chats, and there's no way to know whether they've already scraped your old convos. Opting out just ensures they aren't using your new messages.
Wild
Very few people truly understand this.
nostr:nevent1qqs846fcpqyr8wkatnwdt8q2234mxy555qwzsmpjdkcjq46dh6qqnrcjdjm75
What is even more ironic is that privacy-conscious users use Apple products.
I don’t think they train on private user data. For a while, when Apple was trying to get into the LLM game, they ran their own web scraper to gather training data.
They have settings that literally say to allow Siri to "learn from this app."
Oh I see what you mean. I assumed you meant data for Apple’s LLM training. But yes in iOS there’s the Siri Intents framework. AFAIK it’s all on device and not very invasive. It sounds good in theory—suggesting common actions in apps—but in practice for me at least it’s rarely useful.
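For anyone curious, here's roughly what that looks like from the app side. This is just a sketch (the function name, message content, and conversation identifier are made up): the app "donates" an action to the system via the Intents framework, and those donations are what feed the on-device Siri Suggestions behind the "Learn from this App" toggle.

```swift
import Intents

// Rough sketch: "donating" an interaction so Siri can learn from it.
// INSendMessageIntent is a real system intent, but it's used here purely
// for illustration; the content and identifier are hypothetical.
func donateSendMessageInteraction() {
    let intent = INSendMessageIntent(
        recipients: nil,
        outgoingMessageType: .outgoingMessageText,
        content: "On my way!",                       // made-up message
        speakableGroupName: nil,
        conversationIdentifier: "demo-conversation", // made-up identifier
        serviceName: nil,
        sender: nil,
        attachments: nil
    )

    // The donation is what can later surface as a Siri Suggestion.
    // If the user turns off "Learn from this App", the system ignores these.
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```

As far as I know, these donations are stored on the device and only used to rank suggestions, which matches the "on device, not very invasive" description above.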
I don’t trust it. Maybe it’s all on device… but I’d rather opt into it if I wanted to, not opt out.
Fair enough. It’s on-device so it doesn’t really concern me, and I doubt most people care either way, but it’s easy to disable if it really does freak you out. I would still count it as Apple going out of its way to keep private information on your own device rather than uploading it to some AWS bucket to log forever like most companies would.