I run them on my 3090 with KoboldCpp. Modern 30Bs are pretty smart. But you don't need a good GPU to have fun with an LLM: 6GB of VRAM is enough, and otherwise there are free options such as Google Colab that can run KoboldCpp too.
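For anyone trying this on a smaller card, a KoboldCpp launch looks roughly like this (the model filename and layer count here are placeholders; tune `--gpulayers` to whatever fits your VRAM):

```shell
# Offload part of the model to the GPU; the remaining layers stay in system RAM.
# On a 6GB card a quantized GGUF model with a modest --gpulayers value works;
# on a 3090 you can usually offload all layers.
python koboldcpp.py --model your-model.gguf \
  --usecublas \
  --gpulayers 20 \
  --contextsize 4096
```

If generation is too slow or you run out of VRAM, lower `--gpulayers`; if you have headroom, raise it until the whole model fits on the GPU.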