It’s funny … there was an interview with Elon where they were talking about the massive compute expansion required for AI. One guy pointed out how it’s going to centralize everything in a few hands and people won’t be able to do their own computing. Elon immediately said “what do they need it for?” and the third guy immediately changed the subject. It was awkward and left unanswered - but it tells you everything you need to know about the future.
Discussion
Bullish on decentralized solutions 😇
What would be a viable option for each person to run their own AI with their own hardware resources?
Elon’s thinking is all server-side, I guess, but Apple’s product lineup looks set to be a hero for local inference. MLX, unified memory, and Apple Silicon make reasonably large models accessible to pro consumers and hobbyists. Exo Labs + Mac Studios can get pretty sophisticated jobs done.
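For anyone who wants to try it, here's a minimal sketch of local inference with the mlx-lm Python package on Apple Silicon. The model name is just an illustrative example, and the exact generate() arguments can differ between mlx-lm versions, so treat this as a starting point rather than a recipe.

```python
# Minimal local-inference sketch using mlx-lm on Apple Silicon.
# Assumptions: `pip install mlx-lm` and enough unified memory for a 4-bit 7B model;
# the model repo name below is illustrative.
from mlx_lm import load, generate

# Downloads (or loads from cache) a quantized model and its tokenizer.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Summarize why local inference matters for privacy, in two sentences."

# Runs entirely on-device; the prompt and response never leave the machine.
response = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(response)
```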
In an ideal near future, households would have a locally run, always-on model with full access to the owner’s personal data. It would act as a mediating layer that interacts with frontier models to carry out long-horizon tasks for the household without leaking sensitive data to third parties.
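A toy version of that mediating layer might look like the sketch below: a local model answers what it can from the owner's data, and only a redacted task description is ever forwarded to a remote frontier model. Everything here (the `local_answer` and `call_frontier_api` functions, the PII patterns) is hypothetical and just meant to show the shape of the idea.

```python
# Toy sketch of a household "mediator" layer; all names and patterns are hypothetical.
import re

# Things the local layer should never forward off-device.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                              # SSN-like numbers
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),                        # email addresses
    re.compile(r"\b\d{1,5}\s+\w+\s+(Street|St|Ave|Road|Rd)\b", re.I),  # street addresses
]

def redact(text: str) -> str:
    """Replace anything that looks like personal data before it leaves the house."""
    for pattern in PII_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

def local_answer(task: str, personal_data: dict) -> str | None:
    """Placeholder: the always-on local model tries to handle the task itself."""
    if "calendar" in task.lower():
        entries = personal_data.get("calendar", [])
        return f"Handled locally using {len(entries)} calendar entries."
    return None  # beyond the local model's capability

def call_frontier_api(prompt: str) -> str:
    """Placeholder for a remote frontier-model call; only redacted text reaches it."""
    return f"(frontier model response to: {prompt!r})"

def mediate(task: str, personal_data: dict) -> str:
    answer = local_answer(task, personal_data)
    if answer is not None:
        return answer
    # Escalate to the frontier model, but strip personal details first.
    return call_frontier_api(redact(task))

if __name__ == "__main__":
    data = {"calendar": ["dentist 3pm", "school pickup 5pm"]}
    print(mediate("Check my calendar for conflicts tomorrow", data))
    print(mediate("Plan a trip; my email is jane@example.com", data))
```

The design choice doing the work is that escalation is opt-in and filtered: the frontier model only ever sees what the local layer decides to send.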