I meant on Android. Are you saying it can communicate with your self-hosted LLM that accepts Ollama API calls? Would be awesome if it could run on the phone.

Your phone would talk to your Ollama server, yeah. It wouldn't be executing the model on the phone; phones aren't powerful enough to run these models well yet.
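
Something like this is the basic shape of the request your phone would send (a minimal sketch; the server address and model name are placeholders for whatever your setup uses, and on Android you'd want to run this off the main thread, e.g. with a coroutine or OkHttp, with the INTERNET permission declared):

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Minimal sketch: POST a prompt to an Ollama server's /api/generate
// endpoint and print the raw JSON response. The LAN address and model
// name below are hypothetical -- substitute your own server and model.
fun main() {
    val url = URL("http://192.168.1.10:11434/api/generate") // hypothetical server address
    val body = """{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}"""

    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }

    // With "stream": false, Ollama returns a single JSON object whose
    // "response" field holds the generated text.
    println(conn.inputStream.bufferedReader().readText())
    conn.disconnect()
}
```

The heavy lifting all happens on the server; the phone just sends the prompt over the network and renders whatever text comes back.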