Replying to Vitor Pamplona

Frankly, I am just trying a bunch of stuff/libraries/demo apps to see where we are at with local LLMs. Most of what I am seeing is just very poor ports of server runtimes, which is terrible.

b'TC.py 10mo ago

https://www.reddit.com/r/LocalLLaMA/comments/1fhd28v/ollama_in_termux/?rdt=50828

