If the LLM provides an answer much better than local Ollama or GPT4All models, and what I need to ask is sensitive, then yes.
In practice that means no, sorry.