I think you could get away with running the 7B model, and maybe the 13B with two of them, but those cards are quite old now, so 'well' would be subjective. It depends on the CPU too, but I don't think they have enough VRAM for the 40B model.

I have a Deepseek 7B model running without a GPU while testing it out (Ryzen 9 7900); it is slow, but it does the job.
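
If you want to try the same CPU-only setup, here is a minimal sketch using llama-cpp-python with a quantized GGUF build of the model; the file name and thread count are placeholders for whatever quant and core count you actually have:

```python
from llama_cpp import Llama

# CPU-only inference: n_gpu_layers=0 keeps everything on the CPU.
llm = Llama(
    model_path="deepseek-llm-7b-chat.Q4_K_M.gguf",  # placeholder: path to your local GGUF file
    n_gpu_layers=0,   # no layers offloaded to a GPU
    n_threads=12,     # e.g. one thread per physical core on a Ryzen 9 7900
)

out = llm("Explain what a context window is.", max_tokens=128)
print(out["choices"][0]["text"])
```

Expect it to be noticeably slower than GPU inference, but a 4-bit quant of a 7B model fits comfortably in system RAM, which is why it works at all without a card.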
