Frankly, I am just trying a bunch of stuff/libraries/demo apps to see where we are at with local LLMs. Most of what I am seeing is just very poor ports of server runtimes, which is terrible.
https://www.reddit.com/r/LocalLLaMA/comments/1fhd28v/ollama_in_termux/?rdt=50828