Used open-webui for the first time with a llama.cpp ROCm server + AMD RX 5700 + Llama-3 8B Instruct. Very slick local AI workflow. Speed is very fast. It's slightly dumber than bigger models, but not too bad.
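For anyone curious how the pieces fit: Open WebUI just talks to llama.cpp's llama-server over its OpenAI-compatible endpoint, so you can hit the same backend from a script too. A minimal sketch, assuming the server is already running on localhost:8080 with the Llama-3 8B Instruct GGUF loaded (port, model name, and file path are placeholders for my setup):

```python
# Query a local llama.cpp server (llama-server) via its OpenAI-compatible API,
# the same endpoint Open WebUI is pointed at.
# Assumes the server was started with something like:
#   ./llama-server -m Meta-Llama-3-8B-Instruct.Q4_K_M.gguf --port 8080
from openai import OpenAI

# No real API key is needed for a local server; the client just requires a value.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="llama-3-8b-instruct",  # placeholder; llama-server serves the single loaded model
    messages=[{"role": "user", "content": "Say hello from my local GPU."}],
)
print(resp.choices[0].message.content)
```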
This is on a MacBook Pro?