Considering buying an eGPU enclosure for the two 1070s I have, to run llama locally. Anyone know if that's powerful enough to perform well? Not trying to do any real training. #asknostr
Discussion
I haven't heard the best results from the 10-series cards compared to the 20-series, but that's just what others have told me. I haven't priced enclosures in the past couple of years, but wouldn't it be cheaper these days to pick up a used desktop rig to stuff those in? Or a retired server machine (that's what I do).
Good thinking, I'll try that first
I think you may get away with running a 7B model, maybe a 13B with the two of them, but they're quite old now, so 'well' would be subjective. It depends on the CPU too, but I don't think they have enough VRAM for a 40B model.
I have a DeepSeek 7B model running without a GPU while testing it out (Ryzen 9 7900); it's slow but does the job.
16GB total in SLI?
Hmm - you could run a 7B quite easily on that.
Why not consider a GPU with 16GB in one unit?
BTW, you need a minimum of 8GB to run the 7B models. Some are more optimized these days, but you can check on Hugging Face.
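For anyone sizing this up, here's a rough back-of-the-envelope sketch of the VRAM math (my own assumption, not an official formula: weights take roughly params × bits ÷ 8 bytes, plus some headroom for the KV cache and runtime overhead, which I've guessed at 1.5 GB):

```python
def est_vram_gb(params_billion: float, bits_per_weight: float,
                overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight bytes = params * bits / 8,
    plus a guessed flat overhead for KV cache and activations."""
    weight_gb = params_billion * bits_per_weight / 8  # billions of params -> GB
    return weight_gb + overhead_gb

# 7B at 4-bit quantization: ~3.5 GB of weights + overhead,
# comfortably inside a single 1070's 8 GB.
print(round(est_vram_gb(7, 4), 1))   # prints 5.0

# 13B at 4-bit: ~6.5 GB of weights + overhead; tight on one card,
# fine split across two 1070s (16 GB combined).
print(round(est_vram_gb(13, 4), 1))  # prints 8.0
```

The overhead number is a guess and grows with context length, so treat the output as a floor, not a guarantee.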
I already own the two 1070s, so I don't want to buy new hardware.