Check out Ollama. It's not the best performance-wise, but it's pretty easy to use and will work even if you don't have much VRAM or GPU compute.
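For example, once Ollama is installed and you've pulled a model, you can talk to it from Python in a few lines. This is just a minimal sketch using the official `ollama` Python package; the model name is only an example, swap in whatever you've pulled locally:

```python
# Minimal sketch, assuming Ollama is installed and running locally,
# a model has been pulled (e.g. `ollama pull llama3`),
# and the client library is installed via `pip install ollama`.
import ollama

# Send a single chat message to the local Ollama server.
response = ollama.chat(
    model="llama3",  # example model name; use any model you've pulled
    messages=[{"role": "user", "content": "Explain what quantization does in one sentence."}],
)

# The reply text is under message.content in the response.
print(response["message"]["content"])
```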