Sure, but your phone can download public data to train your local model. The global model doesn't need your local data.
Discussion
Training an LLM on a phone is, at the moment, science fiction 😅
In theory, yes, that is how it could work. In practice, those tools still do not exist. I think both the hardware and the thinking around LLM architecture have to change before something like that becomes possible. Still, can you, even in theory, do better than a centralized service with humongous computational resources?
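The scheme being discussed is essentially federated averaging: each device trains on its own data and only the resulting weights are sent back and averaged, so the server never sees raw local data. A minimal toy sketch of that idea, assuming a simple linear model in NumPy (the dataset, `local_train`, and all hyperparameters here are illustrative, not anyone's real implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def local_train(w, X, y, lr=0.1, steps=50):
    """A few gradient-descent steps on one client's private data."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Three "phones", each holding its own private dataset.
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=20)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):
    # Each client trains locally; the server only averages the weights.
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = np.mean(updates, axis=0)

print(global_w)  # converges toward true_w without the server seeing raw data
```

Of course, this says nothing about whether a phone can do the local step for an LLM-sized model, which is the practical objection above.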