Will you share the setup and use cases then?
hw setup?
probably a "standard" gaming rig but with 16g of vram.
i need to check setups which support multiple graphic cards if I need it.
well use cases ... all of them 😂
What software? How will you install? What models? What are you using LLMs for mainly?
I'll go with ollama + open webui.
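something like this compose stack is what I have in mind (just a sketch, I still need to double-check image names/ports/volumes against the official docs):

```yaml
# rough sketch of a compose file for ollama + open webui
# (images and env vars as I understand them, verify against the docs)
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```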
Primarily I want to provide restricted models for my kids, so they help with education but don't let them cheat.
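for the kids I'm thinking of baking a system prompt into an ollama Modelfile, roughly like this (sketch only; the base model and the prompt wording are placeholders, and a system prompt alone won't be bulletproof):

```
# sketch of an ollama Modelfile for a "homework helper" (base model is a placeholder)
FROM llama3.1

SYSTEM """
You are a patient tutor for school-age kids.
Explain concepts step by step and ask guiding questions,
but never write complete homework answers, essays, or
solutions that can be copied verbatim.
"""

PARAMETER temperature 0.7
```

then `ollama create homework-helper -f Modelfile`, and if I understand open webui right, restrict which models each kid's account can see.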
Then I want some good model for coding; I need to find one which fits best.
I plan to set up my MCPs to help me with Clojure, Java and Golang development.
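something along these lines for the MCP side (rough sketch: the filesystem server is a real npm package, the clojure entry is a placeholder for whatever I end up picking, and with open webui I'd likely have to proxy these through mcpo):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/projects"]
    },
    "clojure-dev": {
      "command": "my-clojure-mcp-server",
      "args": ["--project-root", "/home/me/projects/clj-app"]
    }
  }
}
```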
I hope I can extend the models with the personal notes I've been compiling for over a decade.
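open webui's built-in knowledge/RAG feature will probably cover the notes, but the basic idea I have in mind is roughly this (quick sketch against ollama's local API; model names and the inline notes are just placeholders):

```python
# rough RAG sketch against a local ollama instance (http://localhost:11434)
# assumes `ollama pull nomic-embed-text` and `ollama pull llama3.1` were run
import requests

OLLAMA = "http://localhost:11434"

def embed(text: str) -> list[float]:
    # get an embedding vector for a chunk of text
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

# the real notes would be chunked and embedded once, then stored;
# here they're just a couple of inline strings for illustration
notes = [
    "Clojure transducers compose transformations without intermediate collections.",
    "In Go, pass context.Context as the first parameter of long-running calls.",
]
index = [(n, embed(n)) for n in notes]

def ask(question: str) -> str:
    q = embed(question)
    # pick the most relevant note and stuff it into the prompt
    best = max(index, key=lambda item: cosine(q, item[1]))[0]
    prompt = f"Use this note if relevant:\n{best}\n\nQuestion: {question}"
    r = requests.post(f"{OLLAMA}/api/generate",
                      json={"model": "llama3.1", "prompt": prompt, "stream": False})
    r.raise_for_status()
    return r.json()["response"]

if __name__ == "__main__":
    print(ask("How do transducers help in Clojure?"))
```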
Look, I'm not saying I hate you, but I need to fix my car, not upgrade the RAM on my home server…
RAM is cheap, VRAM is what's really expensive.
Yes, it is. But just out of curiosity, I've ordered a RAM upgrade for my home server to see if it's usable. And since I suspect it will be, the next step would be dedicated hardware, and there we go…
I don't have any usable graphics card, and the CPU is basically useless for anything near serious.
If you want to run bigger models, I think 24 GB of VRAM is a must (RTX 3090, 4090, 5090).
The oldest one sucks already. I'm thinking of pairing a 4090 (if I can get it) with a 5070 Ti for parallel computing (I've seen something similar working).
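rough math on why a single 24 GB card gets tight for the bigger models (very rough back-of-envelope, assuming ~4-bit quantization plus some overhead for KV cache/context):

```python
# back-of-envelope VRAM estimate for a dense model at ~4-bit quantization
# (very rough: weights ~= params * 0.5 bytes, plus ~20% for KV cache/overhead)
def vram_gb(params_billion: float, bits: int = 4, overhead: float = 1.2) -> float:
    weights_gb = params_billion * bits / 8  # ~1 GB per billion params per byte
    return weights_gb * overhead

for size in (8, 14, 32, 70):
    print(f"{size}B @ 4-bit ≈ {vram_gb(size):.0f} GB")
# 8B ≈ 5 GB, 14B ≈ 8 GB, 32B ≈ 19 GB (fits in 24 GB), 70B ≈ 42 GB (needs 2 cards or offload)
```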
That's ~$5.5k (whole rig without peripherals).
Yes, that’s what I imagined. But let’s see.