Tried this with my local llama.cpp + ROCm build, but it seems to need "tools" support. I guess I need to use Ollama instead.
Discussion
No replies yet.