Storage is currently 2x 990 Pro NVMe SSDs. The NVLink bridges for paired 3090 GPUs help a lot: instead of both cards sending their traffic back down the PCIe lanes, only one needs to, which speeds things up. Cards on these 3-slot bridges will sit close to each other but not touching. I will have fans under the cards aimed up (hot air rises) to pull heat off the GPU processors as fast as possible. That's a lot of fans, but I have built similar rigs before and it should be pretty damn quiet. 24GB of VRAM per 3090, times six or more cards, is healthy. That lets me avoid overly aggressive quantization and opt for higher-bit precision to preserve model fidelity.
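A quick way to sanity-check a build like this is to dump the total VRAM pool and confirm the bridged pairs actually expose peer-to-peer access. Here is a minimal sketch, assuming PyTorch with CUDA is installed on the box; summarize_gpus is just an illustrative name, and the output depends on your own card count.

```python
import torch

def summarize_gpus():
    # Count cards and add up the VRAM pool (24 GiB per 3090 expected).
    n = torch.cuda.device_count()
    total_gib = 0.0
    for i in range(n):
        props = torch.cuda.get_device_properties(i)
        gib = props.total_memory / 1024**3
        total_gib += gib
        print(f"GPU {i}: {props.name}, {gib:.1f} GiB")
    print(f"Total VRAM pool: {total_gib:.1f} GiB across {n} cards")

    # Peer access check: bridged pairs should report True for each other,
    # meaning transfers between them can skip the trip over PCIe.
    for i in range(n):
        for j in range(n):
            if i != j and torch.cuda.can_device_access_peer(i, j):
                print(f"GPU {i} -> GPU {j}: peer access available")

if __name__ == "__main__":
    summarize_gpus()
```

From the command line, `nvidia-smi topo -m` gives a similar view, showing which card pairs are connected over NV links versus plain PCIe.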
#ai #llm #localllama #localllm #grownostr #nostr #gfy
Not running your own locally hosted LLM? Then you are giving away your thoughts and ideas to corporations and governments who pay to know what and how you think.
https://video.nostr.build/a027a52098435f570e0340e42b93296b0b608dd462b1c3a57873c254b50289a2.mp4