I may have come off a bit strong. I think you're offering better privacy than using OpenAI & co. directly, and your service is great and worth paying for.

But I still believe your service shouldn't be used for anything personal or anything you wouldn't want to get out. If you HAVE to use LLMs for something personal, the only real solution is a local LLM running on your own machine.
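
For what it's worth, that can be as lightweight as talking to a model served on localhost. A minimal Python sketch, assuming Ollama is running locally and a model such as llama3 has already been pulled (the tool and model are just examples, not the only option):

```python
# Minimal sketch: query a locally served model so the prompt never
# leaves the machine. Assumes Ollama is running on its default port
# (11434) and the "llama3" model has been pulled beforehand.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama HTTP API and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("Summarize this private note: ..."))
```

Nothing in that round trip touches a third-party server, which is the whole point.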

Naturally, the better solution is no LLMs at all, but that's just my personal opinion.

Discussion

Agreed that some information is too sacred to give even to us. Unfortunately, spinning up a local LLM is hard for most people. Models running in trustless execution environments are a possible great fit for combining convenience and privacy, like nostr:nprofile1qyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpxpmhxue69uhkjarrdpuj6em0d3jx2mnjdajz6en4wf3k7mn5dphhq6rpva6hxtnnvdshyctz9e5k6tcqyp7u8zl8y8yfa87nstgj2405t2shal4rez0fzvxgrseq7k60gsrx6zeuh5t.
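
A rough sketch of the idea (every name here is hypothetical, just illustrating the flow, not a real attestation API): the client refuses to send a prompt unless the enclave's attested code measurement matches one it already trusts.

```python
# Illustration only: the attestation fields and EXPECTED_MEASUREMENT
# value are made up; real TEEs have their own attestation formats.
import hashlib
import hmac

# Hash of the enclave build the client has audited and chosen to trust.
EXPECTED_MEASUREMENT = "placeholder-measurement-hash"

def enclave_is_trusted(attestation: dict) -> bool:
    """Accept the enclave only if its attested code hash matches ours."""
    measurement = attestation.get("code_measurement", "")
    # Constant-time comparison, same habit as for any secret-ish check.
    return hmac.compare_digest(measurement, EXPECTED_MEASUREMENT)

def send_prompt(attestation: dict, prompt: str) -> None:
    if not enclave_is_trusted(attestation):
        raise RuntimeError("attestation mismatch: refusing to send data")
    # Only at this point would the client open an encrypted channel to
    # the enclave and send the prompt (transport omitted in this sketch).
    print("enclave verified; prompt digest:",
          hashlib.sha256(prompt.encode()).hexdigest()[:12])
```

The convenience comes from the model still running on someone else's hardware; the privacy comes from refusing to talk to it unless the attestation checks out.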