Federated inference and learning
Discussion
I’m loosely familiar with the concept. What’s the state of the art? Apple’s already doing this, right?
Read this and then think about what Apple have put in place.
https://damus.io/note1xykv3a02p2s0es4mnjt38txrns6h232y6jk6z2uu6wkzuwag2ggsgmsy9k
What do you see Apple having put in place relative to the Oppenheimer moment of FOSS AI?
Apple doesn't want to win but rather to be the best they can be for their market. They are featured in the book The Infinite Game by Simon Sinek. I liked Microsoft better because I'm not a very good teacher, which was Apple's chosen market.
Also this: https://youtu.be/l7tWoPk25yU
I missed this news.
But it adds weight to Garry Kasparov's position that human + machine is the killer combo.
I’m not worried about AI dominating humanity through bad intentions / actions.
If we live in a world with good and bad humans plus good and bad AI and we have a mix of…
Good humans
Bad humans
Good AI
Bad AI
Good humans + good AI
Good humans + bad AI
Bad humans + good AI
Bad humans + bad AI
The only one I would worry about is…
Bad humans + good AI. That’s the lawful evil stuff. That’s how we need to think about avoiding really bad outcomes.
Luckily for us there has only ever been 1 "good" person to have lived, and tools aren't inherently good or bad. No worries 🤙 mahalo. Instead of worrying, we can hope for the best and plan for the worst. Better to be a warrior in the garden than a gardener in a war. If you liked Kyle laying truth down, check out Ben Zander: https://youtu.be/qTKEBygQic0
The OSS community already owns the hardware; we don't need exascale machines, we only need each other.
FOSS AI will result in a rich fabric of highly specialised models that perform narrow, discrete tasks extremely well. Each is small and can be trained and iterated cheaply with highly curated data.
These models are like pixels, put them all together and you see the full picture. That picture is how we will achieve AGI.
Trying to draw the picture monolithically means working in a vast problem space, and you have to redraw the entire thing in order to change small details.
Big tech is painting water colours, whereas FOSS is an 8K OLED.
It's on-device training of small models that is the secret recipe. Horizontal scaling, not vertical.
I am not familiar with Apple's ML workflow, but the idea of FIL is that you can train models in a distributed way without having to pass data between devices, only device-specific estimates of model parameters. This improves data privacy, since no remote access to the data is needed; it reduces bandwidth for distributed computing; and it has good theoretical convergence guarantees, in the sense that the final solution shared between devices will be as good as if you had optimized the model on a single device with access to all the data. It is a rather new paradigm, so there is a lot of room for improvement.
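To make the idea concrete, here is a minimal sketch of one common federated-learning scheme, federated averaging (FedAvg): each client trains locally on its own private data and sends back only an updated parameter, which the server averages. This is an illustration of the general paradigm described above, not Apple's actual workflow; the data, client setup, and hyperparameters are all hypothetical.

```python
import random

def local_update(w, data, lr=0.1, epochs=5):
    """One client's local training: gradient descent on y = w*x (MSE loss).
    Only the updated parameter leaves the device, never the raw data."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, client_datasets):
    """Server broadcasts w_global; each client trains locally; the server
    averages the returned parameters. No raw data is transmitted."""
    local_ws = [local_update(w_global, d) for d in client_datasets]
    return sum(local_ws) / len(local_ws)

# Hypothetical setup: three clients, each holding private samples of y = 3*x.
random.seed(0)
clients = [[(x, 3 * x) for x in (random.random() for _ in range(20))]
           for _ in range(3)]

w = 0.0
for _ in range(30):
    w = federated_round(w, clients)
print(w)  # converges toward the true slope, 3.0
```

Here every client's data is consistent with the same underlying model, so the averaged solution matches what single-device training on the pooled data would find; with heterogeneous (non-IID) client data, convergence is the harder part of the theory.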