I have no idea where my data is going. That’s why I refuse to use LLMs at this point.

Discussion

It’s all a matter of where the model is hosted. I wouldn’t use their hosted service (just as I don’t use OpenAI’s), but I’d host a model locally, as I have in the past — see the sketch below. I’d also give code auditors time to look at this particular one, since it’s all open source.
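For anyone curious what “locally host a model” can look like in practice, here’s a minimal sketch using llama-cpp-python with an open-weights GGUF checkpoint already downloaded to disk (the model path and prompt are placeholders, not a recommendation of any specific model). Nothing leaves your machine; inference runs entirely on local hardware.

```python
from llama_cpp import Llama

# Load a local GGUF checkpoint; the path is a placeholder for whatever
# open-weights model you've downloaded and (ideally) audited.
llm = Llama(model_path="./models/example-model.gguf", n_ctx=2048)

# Run a prompt locally; no network calls, no third-party service.
out = llm("Q: Where does my prompt data go? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

The same idea applies to tools like llama.cpp or Ollama run on your own box: the privacy question shifts from “where is my data going?” to “do I trust the weights and the code I’m running?”, which is exactly where open-source auditing helps.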

💯🎯