Maybe iPadOS/iOS when I get it working, but Apple won't let me zap there, so Dave can't really function
Discussion
Dave runs in the cloud and charges per query rather than running client side?
it does both
but the client-side one requires you to run an ollama/llama.cpp instance and know how to set that up
the one on the network is run by me, so yeah, it would be "in the cloud", aka it runs on my computer.
the Notedeck app version of Dave runs in the client and uses any AI backend available.
I meant on Android. Are you saying it can communicate with your self-hosted LLM that accepts Ollama API calls? It would be awesome if it could run on the phone.
your phone would talk to your ollama server, yeah; it wouldn't be executing the model on the phone, phones aren't good enough to do that yet
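For anyone wondering what "your phone talks to your ollama server" actually looks like, here's a rough sketch (in Rust with reqwest and serde_json; this is an assumption about the setup, not the actual Notedeck/Dave code) of a client hitting Ollama's /api/chat endpoint. The desktop and phone cases only differ in the base URL: localhost when the model runs on the same machine, your self-hosted server's address when the phone reaches it over the network.

```rust
use serde_json::json;

/// Hypothetical helper: send one prompt to an Ollama server and return the reply.
/// `base_url` is http://localhost:11434 locally, or your self-hosted box over the network.
fn ask_dave(base_url: &str, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    // Ollama's chat endpoint; `stream: false` returns a single JSON object.
    let body = json!({
        "model": "llama3", // whatever model the server has pulled; an assumption here
        "stream": false,
        "messages": [{ "role": "user", "content": prompt }],
    });

    let resp: serde_json::Value = reqwest::blocking::Client::new()
        .post(format!("{base_url}/api/chat"))
        .json(&body)
        .send()?
        .json()?;

    // The reply text lives at message.content in Ollama's chat response.
    Ok(resp["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .to_string())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Model never executes on the phone; the client just makes HTTP calls.
    let answer = ask_dave("http://localhost:11434", "What is Notedeck?")?;
    println!("{answer}");
    Ok(())
}
```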
Why does Apple do you like that? Other clients can zap