it does both
but the client-side one requires you to run an ollama/llama.cpp instance and know how to do that
the one on the network is run by me, so yeah, it would be "in the cloud", aka it runs on my computer.
the notedeck app version of dave runs in the client and uses any AI backend available.
I meant on Android. Are you saying it can communicate with your self-hosted LLM that accepts Ollama API calls? Would be awesome if it could run on the phone.
your phone would talk to your ollama server, yeah. it wouldn't be executing the model on the phone, they aren't good enough to do that yet
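in case it helps, here's a rough sketch of what that phone-to-server call could look like: a plain HTTP POST to the Ollama chat endpoint on your LAN. the address, port (11434 is Ollama's default), and model name below are just placeholder examples, not anything Dave-specific; it assumes you already have an ollama server running with a model pulled.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Minimal sketch: send one chat message to an Ollama server on the local network.
// Host/IP and model name are placeholders; swap in your own server and model.
// On Android this would need to run off the main thread (and cleartext HTTP
// allowed for a LAN address); it's kept bare-bones here for illustration.
fun askOllama(prompt: String): String {
    val url = URL("http://192.168.1.50:11434/api/chat")
    val body = """
        {"model": "llama3", "stream": false,
         "messages": [{"role": "user", "content": "${prompt.replace("\"", "\\\"")}"}]}
    """.trimIndent()

    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Content-Type", "application/json")
    conn.doOutput = true
    conn.outputStream.use { it.write(body.toByteArray()) }

    // The reply is a JSON object whose message.content field holds the model's
    // answer; returned raw here to keep the sketch dependency-free.
    return conn.inputStream.bufferedReader().use { it.readText() }
}

fun main() {
    println(askOllama("hello from my phone"))
}
```

the point being: the heavy lifting stays on the machine running ollama, and the phone only ever sends and receives JSON over the network.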