AI can run on any phone; models just need to be optimized for it.


Discussion

All those models were trained on centralized data. Running one locally does not change the data-collection problem. Your local model cannot improve from your local data; you still depend on centralized data collectors to improve it.

Sure, but your phone can download public data to train your local model. The global model doesn't need your local data.

Training an LLM on a phone is, at the moment, science fiction 😅

In theory, yes, that is how it could work. In practice, those tools do not yet exist. I think both hardware and thinking around LLM architecture have to change before something like that becomes possible. Still, can you, even in theory, do better than a centralized service with humongous computational resources?
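For what it's worth, the research direction closest to this is federated learning: each device trains on its own data, and only weight updates are shared and averaged, so raw local data never leaves the phone. A toy sketch of just the averaging step (all names and numbers here are illustrative, not a real training loop):

```python
# Toy sketch of federated averaging (FedAvg): each phone fine-tunes
# locally, and a server averages the resulting weight vectors.
# The weights below are made up for illustration.

def federated_average(client_weights: list[list[float]]) -> list[float]:
    """Average same-shaped weight vectors contributed by each client."""
    n_clients = len(client_weights)
    n_params = len(client_weights[0])
    return [
        sum(w[i] for w in client_weights) / n_clients
        for i in range(n_params)
    ]

# Three phones each fine-tuned a tiny 4-parameter model locally.
phone_a = [0.1, 0.2, 0.3, 0.4]
phone_b = [0.3, 0.2, 0.1, 0.0]
phone_c = [0.2, 0.2, 0.2, 0.2]

global_model = federated_average([phone_a, phone_b, phone_c])
print(global_model)  # each entry is approximately 0.2
```

Whether this beats a centralized service is an open question, but it at least shows the global model can benefit from local data without collecting it.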

Nextcloud has a rating scale for ethical AI. It can help in choosing models and services:

https://nextcloud.com/blog/nextcloud-ethical-ai-rating/

That would be fun to play around with. Any projects in that space you'd recommend checking out?

Most models can be run on desktops. Desktop apps are where anyone should start right now. In a few years, phones will likely have more dedicated memory for models than for the rest of the system. That's when things get interesting.
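To make the memory point concrete, here's a rough back-of-the-envelope calculation (model size and quantization levels are illustrative, and real runtimes add KV-cache and activation overhead on top of the weights):

```python
# Rough memory footprint of model weights at different quantization levels.
# Ignores KV cache and activations, so real usage is somewhat higher.

def weight_memory_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """Approximate GiB needed just to hold the weights."""
    n_bytes = n_params_billions * 1e9 * bits_per_weight / 8
    return n_bytes / 2**30

for bits in (16, 8, 4):
    gb = weight_memory_gb(7, bits)  # a 7B-parameter model as an example
    print(f"7B model @ {bits}-bit: {gb:.1f} GiB")
```

At 4-bit quantization a 7B model needs roughly 3–4 GiB just for weights, which is comfortable on a desktop today but tight on a phone, where the OS and every other app share the same RAM.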