I am sure some madlad has ported llama.cpp to phones. Then just find a teensy tiny model - one sized in millions of parameters (M) rather than billions (B). Should give you something - and a flat battery. x) AI workloads are hardcore...
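For what it's worth, llama.cpp does build under Termux on Android, and the llama-cpp-python bindings make for a tiny test script. A minimal sketch, assuming you've already downloaded some small quantized GGUF model (the path, context size, and thread count below are placeholders, not a recommendation):

```python
# Minimal sketch: loading a tiny GGUF model with llama-cpp-python
# (e.g. inside Termux on Android). Model path and settings are
# hypothetical -- swap in whatever small model you actually grab.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tiny-model.Q4_K_M.gguf",  # placeholder path to a small quantized model
    n_ctx=512,     # short context window to keep RAM use down on a phone
    n_threads=4,   # assume a few usable CPU cores on the SoC
)

out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```

A 4-bit quantized file keeps both the download and the RAM footprint small, which is about the only way the phone (and its battery) gets through the experiment.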
