Ok LLM experts...

If one wants to build a system to have their own ChatGPT or LLM or whatever offline, what kind of equipment do you need in terms of disk storage, RAM, CPU, GPU? GPU RAM?

Discussion

Ryzen 9, NVIDIA 4060 Ti, 64GB RAM?

Bigger is always better

Depends on the models you want to run, of course. I think the RTX 3090 is the best value with 24GB of VRAM.

You don't need an insane hard drive, CPU, or even RAM.

3090 or 4090. Avoid AMD GPUs. You can run them on less, but these are the standard. I bought a 4080S for gaming and it gets the job done, but LLMs are not my primary use case. My 3080 Ti did great work too. I'd say 12GB cards are acceptable for casual use and 16GB is good. It depends on how fast you need answers, how much cumulative time you'll spend in the LLMs, and your budget.
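To put some rough numbers behind those VRAM tiers, here's a back-of-the-envelope sketch: a model's weights take roughly (parameters × bits-per-weight ÷ 8) bytes, plus some headroom for the KV cache and activations. The `overhead` factor below is an assumed ballpark, not a measured figure, and real usage varies by runtime and context length.

```python
def vram_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate in GB: weight bytes times an assumed
    overhead factor for KV cache and activations (ballpark only)."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# 7B model, 4-bit quantized: ~4.2 GB -> fits comfortably on a 12GB card
print(round(vram_gb(7, 4), 1))
# 70B model, 4-bit quantized: ~42 GB -> too big even for a single 24GB 3090
print(round(vram_gb(70, 4), 1))
```

This is why the 12GB/16GB/24GB tiers matter: each step up roughly moves you into the next model-size bracket at a given quantization level.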

Models are quite large, and if you want multiple installed, start at 2TB and work your way up. M.2 of course. 1TB if you don't think you'll be exploring many models, or already know how large the models you want are.
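The disk side follows the same arithmetic: on-disk size is roughly (parameters × bits-per-weight ÷ 8), and real files add a bit of metadata on top, so treat these as ballpark figures. A quick sketch of why a small model collection eats a terabyte fast:

```python
def file_size_gb(params_billion, bits_per_weight):
    """Approximate on-disk size in GB of quantized model weights
    (rough estimate; actual files include extra metadata)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical collection: 7B, 13B, and 70B models at 4-bit each
sizes = [file_size_gb(b, 4) for b in (7, 13, 70)]
print([round(s, 1) for s in sizes])   # per-model sizes in GB
print(round(sum(sizes), 1))           # total for just three models
```

Three models at one quantization level is already ~45GB; keep a few quantization variants of each and 1TB stops feeling roomy.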

I'd crank the RAM up to whatever fits your budget. 32GB minimum, imo. No reason to avoid the fast stuff these days either.

I can't say the CPU makes enough difference as long as you pick something generally capable, unless you're aiming specifically for CPU-based inference. But that's mostly Apple territory, to my knowledge.