Running personalized AI on servers will be a stronger centralizing force than the "Apple ecosystem". They are going to do everything to make server-side AI win. Every major tech company is hungry for your local context. Don't let them have it.
Discussion
How?
- More open-source models and applications.
- Sovereign services that let you pay for AI without KYC.
- Agentic AIs with wallets and goals. Give them a task to accomplish; don't care how, but here's a wallet to interact on the internet if you need to pay for anything.
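A toy sketch of what that could look like (everything here is hypothetical: a real agent would pay Lightning invoices or sign onchain transactions rather than decrement a counter, and the "steps" would come from the model itself):

```python
# Toy sketch (all names hypothetical): an agent given a goal and a
# spending budget it can draw on autonomously while working on the task.

class Wallet:
    """Minimal budget tracker standing in for a real Lightning/onchain wallet."""
    def __init__(self, balance_sats: int):
        self.balance_sats = balance_sats

    def pay(self, invoice_sats: int) -> bool:
        # A real wallet would pay an invoice; here we just check and deduct.
        if invoice_sats > self.balance_sats:
            return False
        self.balance_sats -= invoice_sats
        return True


def run_agent(goal: str, wallet: Wallet, steps):
    """Execute (description, cost_sats) steps toward a goal, paying for
    each out of the wallet; stop if funds run out."""
    completed = []
    for description, cost_sats in steps:
        if not wallet.pay(cost_sats):
            break  # out of budget: the agent stops rather than overspends
        completed.append(description)
    return completed


wallet = Wallet(balance_sats=1000)
done = run_agent(
    "summarize this week's mempool fees",
    wallet,
    [("query a paid fee API", 300), ("rent 1 min of GPU inference", 500)],
)
```

The wallet is the interesting part: a hard budget cap means the agent can transact on the open internet without being able to overspend its mandate.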
AI can run on any phone, models just need to be optimized for it.
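"Optimized" here mostly means quantized. A rough back-of-the-envelope (weights only, ignoring KV cache and runtime overhead) shows why 4-bit weights bring a 7B model into phone territory:

```python
def model_size_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight-memory footprint in GB, ignoring KV cache
    and runtime overhead."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

fp16 = model_size_gb(7, 16)  # full-precision 7B: too big for most phones
q4 = model_size_gb(7, 4)     # 4-bit quantized: feasible on a flagship phone
```

Quantization costs some quality, but it is the difference between "needs a workstation GPU" and "fits next to the OS in phone RAM".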
All those models were trained on centralized data. Just because you are running it locally does not change the data-collection problem. Your local models are not able to improve with your local data; you still depend on centralized data collectors to improve them.
Sure, but your phone can download public data to train your local model. The global model doesn't need your local data.
Training an LLM on a phone is, at the moment, science fiction 😅
In theory, yes, that is how it could work. In practice those tools still do not exist. I think both hardware and thinking around LLM architecture have to change before something like that becomes possible. Still, can you even in theory do better than a centralized service with humongous computational resources?
Nextcloud has a rating scale for ethical AI. It can help in choosing models and services.
That would be fun to play around with. Any projects in that space you'd recommend checking out?
Most models can be run on desktops. Desktop apps are where anyone should start right now. In a few years, phones will likely have more dedicated memory for models than for the rest of the system. That's when things get interesting.
I just experimented with Mistral and Llama 3 on my Start9 and my mind is blown. Lots of false results, you always need to double-check, but I was particularly impressed by the ability to explain scientific knowledge. Below I asked "why is the sky blue, and give me the underlying physics equations"


I don't think you got the point. AI is awesome, self-hosted or API-accessed; there is no doubt about that. The risk is teaching an AI you don't own everything about you, your needs, your friends, your anything really. Big tech will use it for their purposes, and the government will use it against you.
Fair enough. Training AI will always require huge computing infrastructure. But at least we can solve the server-hosting and API issue with open-source tools. Running AI privately instead of having a giant corporation collect your prompt data is also a major win.
High-quality data is the primary difference between good and bad models of whatever people call AI these days. People will use the service that provides better results, so one really has to solve a hard problem: training models on local data, client-side, somehow ensuring that no one is feeding shit to those models, and, after collecting information from local models, providing the same quality responses for everyone. At the moment, the tech to compete with centralized services does not exist.
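For what it's worth, the client-side training part has a known shape: federated averaging, where only weight updates leave the device, never the raw data. A minimal sketch with a toy linear model (not an LLM, and it doesn't address the feeding-garbage or quality-parity problems; it only illustrates the aggregation step):

```python
# Minimal federated-averaging sketch: each client trains on its own
# private data; the server only ever sees weights, never the data.
# Toy model y = w * x, one gradient step per client per round.

def local_step(w: float, data, lr: float) -> float:
    """One gradient-descent step on this client's private (x, y) pairs."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(w_global: float, client_datasets, lr: float) -> float:
    """Each client refines the global weight locally; the server averages
    the returned weights without ever seeing the underlying data."""
    local_weights = [local_step(w_global, d, lr) for d in client_datasets]
    return sum(local_weights) / len(local_weights)

# Three clients whose private data all follow y = 3x.
clients = [[(1.0, 3.0)], [(2.0, 6.0)], [(0.5, 1.5)]]
w = 0.0
for _ in range(500):
    w = federated_round(w, clients, lr=0.05)
# w converges toward 3.0 even though no client shared its data.
```

Scaling this to LLM-sized models on phone hardware is exactly the part that doesn't exist yet, which is the commenter's point.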
💯 THIS 👆
Most already have it.
Is this where something like https://tinygrad.org/#tinybox comes in?
Would Lit.OTG be relevant here?
https://popzazzle.blogspot.com/2023/08/writers-take-web-offline-with-litotg.html