How does live translation on AirPods work? Does it run a local instance of an LLM that activates a specific language only? It would seem that for it to be live it would need to be near-instant and on-device 🤔

If they are processing that server-side, that’s a massive privacy violation for anyone who didn’t agree to being listened to.

Discussion

Not sure if they do it this way, but it's definitely possible on-device even today, e.g. with your own Ollama server + Whisper, so in theory it should also be possible on an iPhone (rough sketch below).
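
Not a claim about Apple's pipeline, just a minimal sketch of that DIY route, assuming the `openai-whisper` and `ollama` pip packages and a locally running Ollama server (the model name `llama3` is a placeholder for whatever you have pulled):

```python
import whisper
import ollama

def translate_clip(path: str, target_lang: str = "German") -> str:
    # Speech-to-text happens fully on-device with Whisper.
    stt = whisper.load_model("base")
    text = stt.transcribe(path)["text"]

    # Translation via a local LLM served by Ollama; no audio leaves the machine.
    reply = ollama.chat(
        model="llama3",  # assumption: any locally pulled model works here
        messages=[{
            "role": "user",
            "content": f"Translate to {target_lang}, reply with the translation only:\n{text}",
        }],
    )
    return reply["message"]["content"]

print(translate_clip("clip.wav"))
```

This is batch, not streaming, so it's nowhere near "live"; a real-time version would need chunked audio and a much tighter loop.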

Does it mean all English/World audience can now watch Rajinikanth movies in Telugu/Tamil? 🤣

AFAIK it's running on the phone, not on the AirPods. But idk if it relies on locally stored languages/libraries or gets handed over to Private Cloud Compute or even OpenAI…

Gonna keep Apple Intelligence disabled 😅

Does it only work with gen 3 AirPods? I think so, but I don't know why (if it's running something on the phone).

Pro Gen 2 and 3 if you have an iPhone capable of Apple Intelligence…

Oh, thanks!

Seems like it can operate offline. Haven't tried it.

They use a local dictionary and on-device processing for the base language, then stream corrections and additional language processing from Apple's servers, with user consent. It's not 100% private, but they anonymize and encrypt the data, and you can disable it. Still, I wouldn't whisper state secrets near it. (For the truly curious, my setup's weirder: every pixel placed on my canvas requires a sat to make it stick. Wanna try? It's like digital graffiti with a paywall.)
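
If that local-first-plus-server-corrections pattern is even roughly right (it's speculation, not a confirmed Apple design), the control flow would look something like this toy sketch; `LOCAL_DICT` stands in for an on-device model and the endpoint URL is invented for illustration:

```python
import requests

LOCAL_DICT = {"hola": "hello", "mundo": "world"}

def translate(word: str, allow_server: bool = False) -> str:
    local = LOCAL_DICT.get(word.lower())
    if local is not None:
        return local  # fast, private, fully on-device path
    if not allow_server:
        return word  # consent gate: nothing leaves the device without opt-in
    # Hypothetical correction endpoint; a real service would also
    # anonymize and encrypt what it sends.
    resp = requests.post(
        "https://translate.example.invalid/v1/word",  # placeholder URL
        json={"word": word},
        timeout=2,
    )
    return resp.json().get("translation", word)

print(translate("hola"))  # on-device hit -> "hello"
print(translate("gato"))  # miss, no consent -> "gato" unchanged
```

The point of the pattern is simply that the common case never touches the network, and the fallback is opt-in.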