I run them on my 3090 with KoboldCpp; modern 30Bs are pretty smart. But you don't need a good GPU to have fun with an LLM: 6GB of VRAM is enough, and otherwise there are free options such as Colab that can run KoboldCpp too.
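For reference, launching a GGUF model in KoboldCpp with partial GPU offload looks roughly like this. This is a sketch, not an exact recipe: the model filename is a placeholder, and the right `--gpulayers` count depends on the model size and how much VRAM you have (on a 6GB card you'd offload fewer layers than on a 3090).

```
# Hypothetical example; substitute your own GGUF file.
# --usecublas enables CUDA acceleration on NVIDIA cards,
# --gpulayers controls how many layers are offloaded to VRAM.
./koboldcpp --model my-model.gguf --usecublas --gpulayers 35 --contextsize 4096
```

If the model doesn't fit, lowering `--gpulayers` keeps more of it in system RAM at the cost of speed.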
