Is SDXL different? Torch+ROCm 5.4 always worked for me on Stable Diffusion.


Discussion

Yes, SDXL is much more detailed than Stable Diffusion 1.4/1.5. As for ROCm, honestly I can't speak for other setups, but if you look at the sd-webui launch script, it's full of hacks, because the ROCm + PyTorch combo doesn't work everywhere with every release.

https://github.com/AUTOMATIC1111/stable-diffusion-webui/blob/c9c8485bc1e8720aba70f029d25cba1c4abf2b5c/webui.sh

Search for "TORCH_COMMAND" to see the hacks involved.
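For context, this is roughly the kind of override the script applies: it replaces the default (CUDA) PyTorch install command with one pointing at the ROCm wheel index. The exact ROCm version and index URL below are assumptions; check webui.sh for the value matching your GPU family.

```shell
# Sketch only: install the ROCm build of PyTorch instead of the default
# CUDA wheels before launching the web UI. Version/URL are assumptions.
export TORCH_COMMAND="pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.4.2"
./webui.sh
```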

Yeah, but it's a model, right? And inference is done through Torch, which already supports ROCm via a special package index, which is why you need to pass some special commands.

I'm curious why the author mentions it doesn't work with 5.4, and also why they set things like skip-CUDA when the whole point of having ROCm is enabling a CUDA-like API on AMD.
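For what it's worth, a ROCm build of PyTorch reuses the familiar `torch.cuda` API surface (backed by HIP), so a quick sanity check looks like this; the printed values depend on your install:

```python
import torch

# On ROCm builds, torch.version.hip is a version string and torch.cuda
# calls are routed through HIP; on CUDA/CPU-only builds it is None.
print("HIP version:", torch.version.hip)
print("GPU available:", torch.cuda.is_available())
```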

I’ll test it later and see what happens.

I have been running both Stable Diffusion and Llama on an RDNA2 card since earlier this year. In my experience, things are becoming easier, not harder.