I really wanna run this locally through Ollama, man
Wait for the extreme quantizations, which will make it less precise than a tiny llama lol