If the LLM gives answers much better than local ollama or gpt4all models, and what I need to ask is sensitive, then yes.

In practice that means no, sorry.
