8GB is not much. Do you have a reason for using an old Tesla rather than a current-gen consumer GPU?
I use AMD's RX 7700 XT, and it looks like the weak point is the 12GB of memory, not the chip itself. At least for llama.
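For context, a rough sketch of why memory tends to be the limit: at a ~4-bit quant the weights alone cost about half a byte per parameter, so model size maps almost directly to VRAM. A quick back-of-envelope in Python (the bits-per-weight and overhead figures are rough assumptions, not exact numbers; real usage also depends on context length and KV cache):

```python
# Back-of-envelope VRAM estimate for a quantized llama-family model.
# bits_per_weight and overhead_gb are rough assumptions, not measured values.

def vram_estimate_gb(n_params_b: float, bits_per_weight: float,
                     overhead_gb: float = 1.5) -> float:
    """Rough GB of VRAM needed: quantized weights plus a flat overhead allowance."""
    weight_gb = n_params_b * bits_per_weight / 8  # params (billions) * bytes per param
    return weight_gb + overhead_gb

print(f"7B  @ ~Q4: ~{vram_estimate_gb(7, 4.5):.1f} GB")   # ~5.4 GB -> fits in 8GB
print(f"13B @ ~Q4: ~{vram_estimate_gb(13, 4.5):.1f} GB")  # ~8.8 GB -> tight even on 12GB
print(f"13B @ ~Q8: ~{vram_estimate_gb(13, 8.5):.1f} GB")  # ~15.3 GB -> needs more than 12GB
```

The chip mostly affects tokens/sec; whether the model loads at all is almost entirely the memory.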
Can't fit anything bigger in the 1U case. I'm planning to get a 2U server dedicated to AI so I can use bigger cards. Also, these are 8GB each, but they're small (single PCIe slot), so you could fit four in one machine.
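If you do stack four of them, something like this sketch (assuming NVIDIA cards and the nvidia-ml-py bindings, `pip install nvidia-ml-py`) will show what the box can see; runtimes like llama.cpp and ollama can then split a model's layers across all of them:

```python
# Enumerate NVIDIA GPUs and sum their VRAM via NVML. Sketch only.
import pynvml

pynvml.nvmlInit()
total_gb = 0.0
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    gb = mem.total / 1024**3
    total_gb += gb
    print(f"GPU {i}: {name} - {gb:.1f} GB")
print(f"Total VRAM: {total_gb:.1f} GB")  # four 8GB cards -> ~32 GB to spread layers across
pynvml.nvmlShutdown()
```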
Oh, that explains it. I also use my gaming rig for ollama, so server use didn't come to mind 🤣