Ollama is also good. I think it makes better use of the GPU out of the box than llama.cpp does.
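To make the comparison concrete, here is a minimal sketch (not from the original note) of how GPU offload is configured in each. It assumes the llama-cpp-python and ollama Python packages are installed; the model file path and model name are placeholders.

    # llama.cpp (via llama-cpp-python): GPU offload is opt-in; you choose how many
    # transformer layers go to the GPU with n_gpu_layers (-1 offloads everything).
    from llama_cpp import Llama
    import ollama

    llm = Llama(model_path="./models/llama-3-8b.Q4_K_M.gguf", n_gpu_layers=-1)  # placeholder path
    out = llm("Hello!", max_tokens=32)
    print(out["choices"][0]["text"])

    # Ollama: the server decides the GPU/CPU split itself when it loads the model,
    # so the client call carries no offload settings at all.
    reply = ollama.chat(model="llama3", messages=[{"role": "user", "content": "Hello!"}])
    print(reply["message"]["content"])

The practical difference is mostly ergonomic: with llama.cpp you tune the offload yourself, while Ollama picks a split automatically.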
