Also, the containers are giant because Torch. Don’t bother trying to pull from my registry; the connection is limited.
If you are like me and don’t want to help Jensen Huang buy more leather jackets, I’ve written some Dockerfiles that you can use to run popular AGI software on AMD hardware.
If you are using a fairly recent version of Linux you don’t need to install any proprietary drivers, as everything is already baked into the kernel. The stock kernel on the latest Ubuntu LTS, for example, is good enough. ROCm is a runtime, not a driver.
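If you want to sanity-check that, the in-kernel module is amdgpu and the device nodes ROCm talks to should already be there:

```sh
# the upstream amdgpu module ships with the kernel, no extra driver needed
lsmod | grep amdgpu

# these are the device nodes the containers get passed
ls -l /dev/kfd /dev/dri
```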
These are made to run with rootless Podman containers. I’m not sure about Docker, but it should work. Who the hell uses vanilla Docker in 2023 anyway?
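A minimal run sketch for rootless Podman, assuming an image tagged rocm-torch (the tag is made up, use whatever you built):

```sh
# pass the GPU device nodes into the container; label=disable keeps SELinux
# from blocking access to them on Fedora-ish systems
podman run -it --rm \
  --device /dev/kfd \
  --device /dev/dri \
  --security-opt label=disable \
  localhost/rocm-torch:latest
```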
The Dockerfiles are:
- standard ROCm (versions 5.2 and 5.4)
- ROCm plus ROCm-enabled Torch (good starting point for A1111)
- ROCm plus experimental ROCm-enabled KoboldCPP (from https://github.com/YellowRoseCx/koboldcpp-rocm)
- ROCm plus experimental ROCm-enabled llama.cpp (from https://github.com/SlyEcho/llama.cpp.git)
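Building them is the usual affair. The file names below are placeholders, so point podman at whichever Dockerfile you actually grabbed:

```sh
# build a variant; -f points at the Dockerfile, -t names the resulting image
podman build -f Dockerfile.rocm-torch -t rocm-torch .
podman build -f Dockerfile.rocm-koboldcpp -t rocm-koboldcpp .
```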
GGML on ROCm is very experimental and might explode in your face and burn holes in your carpet (kidding, it works, but sometimes the bot goes insane).
You need to be in the group that owns /dev/kfd and /dev/dri, which is normally render.
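To check which group that is and add yourself to it (group names vary a bit between distros, so don’t trust render blindly):

```sh
# see which group owns the device nodes (usually render, sometimes video;
# the renderD number may differ on your machine)
stat -c '%G' /dev/kfd /dev/dri/renderD128

# add yourself to that group, then log out and back in
sudo usermod -aG render "$USER"
```

With rootless Podman you’ll also want --group-add keep-groups on the run command so the supplementary group actually follows you into the container.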
I don’t think this works on APUs. AMD says it should, but I’ve never managed to get satisfactory results. Might be a skill issue on my side.
Some variables might be necessary, like HSA_OVERRIDE_GFX_VERSION or HIP_VISIBLE_DEVICES. I won’t help you here. Have fun.
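For what it’s worth, they are plain environment variables, so -e at run time is enough. The values below are common guesses (10.3.0 is the usual RDNA2 spoof), not a recommendation:

```sh
# spoof the GFX version and pin the container to the first GPU
podman run -it --rm \
  --device /dev/kfd --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=10.3.0 \
  -e HIP_VISIBLE_DEVICES=0 \
  localhost/rocm-torch:latest
```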