8GB of VRAM is barely the minimum; with 12GB you're okay, but otherwise you're going to need to set some launch command-line args. https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Optimizations
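For example, you can put the low-VRAM flags from that wiki page into webui-user.sh (or webui-user.bat on Windows). A minimal sketch, assuming AUTOMATIC1111's webui; pick the flags that fit your card:

```
# webui-user.sh - launch options for stable-diffusion-webui
# --medvram trades some speed for lower VRAM use; --lowvram is for very small cards.
# --xformers enables the memory-efficient attention backend (NVIDIA only).
export COMMANDLINE_ARGS="--medvram --xformers"
```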

Discussion

Neat, thanks! ^^

Yeah, I want to upgrade to something with more VRAM...but the GPU overlords AMD and NVIDIA haven't released a suitable consumer card. There *are* 24GB cards, but those prices make me shiver. xD So I'm holding on to my 2080 Ti for now. ^^

AMD for SD is out for me; they're well behind, or maybe we just need more programmers who know and use AMD cards for SD. In the meantime, NVIDIA is the best-suited option without resorting to so much pain.

AMD's ROCm has been ... well, let's say their GitHub has been super active, and they have been pushing to make themselves more compatible and compliant. There are tools that translate CUDA to their system now. But long story short, they still have quite a way to go before AI on AMD is really something "desirable". It's super unfortunate, because I'd rather not support NVIDIA - but they have what works, so... x.x

Welp they have 2 years to do it before I decide what card to buy lol😅

nostr:npub1nmk2399jazpsup0vsm6dzxw7gydzm5atedj4yhdkn3yx7jh7tzpq842975 Discarded for me for now; they still have time before I decide, since I'm planning to buy a new card 🤣 if there's a chance.

Oh, got it. NVIDIA holds the best-value spot for now, but things will get better thanks to Fine Wine (tm) technology.

Why discarded? SD uses PyTorch, and ROCm is supported. It works well on RDNA2.
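If you want to confirm your install is actually running on ROCm, here's a quick sanity check (a minimal sketch, assuming a ROCm build of PyTorch is installed; the exact wheel/index URL depends on your ROCm version):

```python
# Minimal check that PyTorch can see an AMD GPU through ROCm.
# ROCm builds of PyTorch reuse the torch.cuda API, so these calls
# work on AMD cards too; torch.version.hip is set only in ROCm builds.
import torch

print(torch.__version__)               # e.g. ends in "+rocmX.Y" on a ROCm wheel
print(torch.version.hip)               # None on CUDA/CPU builds
print(torch.cuda.is_available())       # True if the GPU is visible
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```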