No, you can run quantized models on a single decent GPU, like Zephyr-7B-alpha or Mistral-7B.
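For context, a minimal sketch of what that can look like with the Hugging Face transformers + bitsandbytes stack (assuming those libraries are installed; the model IDs are the public Hub names, and a 7B model in 4-bit typically needs roughly 5-6 GB of VRAM):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.1"  # or "HuggingFaceH4/zephyr-7b-alpha"

# 4-bit (NF4) quantization so the 7B model fits in consumer VRAM.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # place layers on the available GPU automatically
)

inputs = tokenizer("Explain quantization in one sentence.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```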

Sick, so one decent mid-to-high-grade GPU would probably suit my needs, and then I just need to make sure I have plenty of RAM and CPU cores.

Yes, you can fine-tune LLMs on a decent GPU with around 12GB of VRAM, like an RTX 4060-class card. If you're just running smaller models (13B parameters or less), the hardware can be less beefy.
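A rough sketch of how fine-tuning fits in that VRAM budget, assuming a QLoRA-style approach with the peft and bitsandbytes libraries (the base model ID and LoRA hyperparameters here are illustrative, not prescriptive):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "mistralai/Mistral-7B-v0.1"

# Load the frozen base model in 4-bit so its weights fit in a few GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

# Train only small LoRA adapter matrices instead of the full 7B weights.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total parameters
```

Because only the adapters are trained, the optimizer state stays small enough that the whole run can fit on a single 12GB card.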

Ok, thanks, that clears up some questions I had.