A local 14B DeepSeek model that can run on a 3080 just wrote a working Snake game in Python, IN ONE GO.

That was a first for me. The future is amazing. All the world's knowledge is at our fingertips.

All you have to do is ask.
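
For anyone curious, the "just ask" step is roughly this against a local Ollama server. This is a minimal sketch, assuming the deepseek-r1:14b tag is pulled and Ollama is on its default port; swap in whatever model tag you actually run.

```python
# Minimal sketch: ask a locally served DeepSeek model for a Snake game via Ollama.
# Assumes Ollama is running on its default port and a 14B DeepSeek tag is pulled,
# e.g. `ollama pull deepseek-r1:14b` -- the tag is an assumption, use your own.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "deepseek-r1:14b"  # assumption: change to the tag you have locally

payload = json.dumps({
    "model": MODEL,
    "prompt": "Write a complete, working Snake game in Python using pygame.",
    "stream": False,  # return one JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())

# The generated code comes back in the "response" field.
print(result["response"])
```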


Discussion

damn, now i have to try it on 8b

What GPU do you have?

running on an old Tesla T4

that's one I don't hear often. it does have the memory to run 14b from what I can tell, so I'd say try that.

you're right, 14b works decently. I should switch from ollama to vllm, and that should speed it up even more
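
If you do make that switch, here's a rough sketch of the vLLM path. The model name and flags are assumptions (pick whatever fits the T4's memory); the client side works because vLLM exposes an OpenAI-compatible server on port 8000 by default.

```python
# Minimal sketch of querying the same model through vLLM instead of Ollama.
# Assumes you've started vLLM's OpenAI-compatible server first, e.g.:
#   vllm serve deepseek-ai/DeepSeek-R1-Distill-Qwen-14B --max-model-len 8192
# (model name and flags above are assumptions -- tune them for your GPU)
import json
import urllib.request

VLLM_URL = "http://localhost:8000/v1/chat/completions"  # vLLM's default port

payload = json.dumps({
    "model": "deepseek-ai/DeepSeek-R1-Distill-Qwen-14B",  # assumption: your model
    "messages": [
        {"role": "user", "content": "Write a working Snake game in Python."}
    ],
    "max_tokens": 2048,
}).encode("utf-8")

req = urllib.request.Request(
    VLLM_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

# OpenAI-style response: the generated text is at choices[0].message.content.
print(reply["choices"][0]["message"]["content"])
```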