Do Nostr relays have similar terms of service to protect user information from being used to train LLMs?

I do not see how that would work since anyone could just run their own relay with whatever policy they desire. I assume everything we post is public. It would be different for images, audio and video though since they are hosted elsewhere and not directly stored on the relays.
I think a relay would have to announce its terms of service on its website for them to be legally binding, like the terms of use at https://nostr.land/terms
nostr:nprofile1qqs99d9qw67th0wr5xh05de4s9k0wjvnkxudkgptq8yg83vtulad30gxyk5sf could you imagine adding such a clause about the use of data for language models as well? Or do you think this would not be in line with your philosophy?
Good point.
So there are 2 concerns, the relay operator (me) and users of the relay.
I can and will add terms limiting how data on the relay can be used by readers.
For the relay operator part, it gets complicated.
While I do not intend to train LLMs or sell data to people who do, I do intend to build my own ML models for content moderation. So I am not sure how to word it in a way that wouldn't end up being in a gray area.
There’s also the fact that other relays would have to enforce the same policy as well.
Thank you for your response. I knew I would learn something from the answer.
The first part, how to phrase it so that a content moderation bot can be fed the relay's data but an external model cannot, is legally difficult, I understand. And I am happy that you are willing to train a model to help with content moderation in a moral manner.
And as I read the terms of use on Mastodon, it is not about how to enforce them on others, or whether other relays operate with the same respect. When there is a market of "terms of service", so users can choose between different relays to upload their notes to directly, I think that already has value.
It is clear that someone can rebroadcast a note to a relay with conflicting terms of service. By the nature of Nostr this cannot be prevented.
Maybe there could also be a way for users to set a flag: "my notes are free to be used for training language models". If enough clients implemented this flag, training could be applied in an opt-in manner, which would be morally preferable.
nostr:nprofile1qqsf03c2gsmx5ef4c9zmxvlew04gdh7u94afnknp33qvv3c94kvwxgsm3u0w6 or nostr:nprofile1qqsyvrp9u6p0mfur9dfdru3d853tx9mdjuhkphxuxgfwmryja7zsvhqelpt5w is there already something similar available in an existing NIP?
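A minimal sketch of what such an opt-in flag could look like as an event tag. The tag name "llm-training" below is purely hypothetical and is not defined in any existing NIP; the event id computation follows NIP-01 (SHA-256 of the serialized array [0, pubkey, created_at, kind, tags, content]):

```python
import hashlib
import json
import time

# Hypothetical opt-in tag; the name "llm-training" is an illustration
# only and is not part of any existing NIP.
LLM_OPT_IN_TAG = ["llm-training", "allow"]

def make_unsigned_event(pubkey: str, content: str) -> dict:
    """Build an unsigned kind-1 text note carrying the hypothetical opt-in tag."""
    event = {
        "pubkey": pubkey,
        "created_at": int(time.time()),
        "kind": 1,
        "tags": [LLM_OPT_IN_TAG],
        "content": content,
    }
    # Per NIP-01, the event id is the sha256 of the JSON serialization
    # [0, pubkey, created_at, kind, tags, content] with no extra whitespace.
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"), ensure_ascii=False,
    )
    event["id"] = hashlib.sha256(serialized.encode()).hexdigest()
    return event

ev = make_unsigned_event("ab" * 32, "hello nostr")
```

A well-behaved scraper could then filter on this tag and include only notes that explicitly allow training, though of course nothing in the protocol forces it to.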
Do relays even have a ToS at all? They certainly never asked me to agree to one, and I can't stop other people from broadcasting my notes to relays I avoid.
More importantly, though, have AI developers ever demonstrated respect for a ToS? Facebook was pirating media to train with, and websites are implementing a minor proof-of-work task to view pages, to discourage AI scrapers that ignore robots.txt. I think the only time they even acknowledge a ToS is as training material.
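The proof-of-work idea mentioned above is essentially hashcash (Nostr's NIP-13 applies the same idea to event ids): the client must find a nonce such that a hash has a required number of leading zero bits, making bulk scraping expensive. A minimal sketch with an illustrative challenge string and difficulty:

```python
import hashlib

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def mine(challenge: str, difficulty: int) -> int:
    """Brute-force a nonce whose hash has at least `difficulty` leading zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1

# Difficulty 12 means ~4096 hashes on average: trivial for one page view,
# costly at scraper scale. The challenge string here is made up.
nonce = mine("page-token", 12)
```

The server only has to do one hash to verify, while the scraper pays the mining cost per page, which is the asymmetry that makes the scheme work.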
That can be true. But I think rules still matter. Maybe not today, but in 5 or 10 years, people may be held accountable for their criminal actions.
I don't think a ToS is really enforceable at the relay level. There's no real control over which relays get your content. Even if you could make AI scrapers respect the ToS, you can't reasonably expect the content you're protecting not to spread to other relays. You yourself can only submit your content to relays you think will respect your choices, but you can't stop others, who may be legitimate but naive users, bots, or shady scrapers circumventing rules, from simply broadcasting your notes to new relays. There are no real control mechanisms for that. The only reasonable assumption is that anything put on the network can be freely scraped and copied around.
no, it's basically designed from the ground up to be easily scrapable by bots...fortunately nobody wants to do that because most of nostr is already bots anyway
I think you are mixing the question of what is possible with what is legally allowed. Terms of service are a legal framework, not a technical one.