There are two sides to this: the first is protecting the user from having to see the shit, and the second is tackling the content itself.

Reports are used by clients to decide which content to hide from their users, taking care of the first problem.
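
For context, a report is itself just a Nostr event (kind 1984 under NIP-56). Roughly, with placeholder ids and pubkeys, one looks something like this:

```typescript
// Rough sketch of a NIP-56 report event (kind 1984). The ids and pubkeys
// here are placeholders; how much weight a client gives a report is up to
// the client (many only count reports from accounts the user follows).
const report = {
  kind: 1984,
  pubkey: "<reporter-pubkey>",
  created_at: 1700000000,
  tags: [
    ["e", "<offending-event-id>", "illegal"], // the note being reported
    ["p", "<offending-author-pubkey>"],       // its author
  ],
  content: "optional free-form reason",
  // id and sig omitted for brevity
};
```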

As for the second problem, people often forget that Nostr relays only carry text. The actual offending content is stored on other platforms, such as image hosts, and it is those platforms that have a duty to weed out abuse imagery.
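
Concretely, a note with an image in it is stored on a relay as nothing more than a kind-1 text event whose content happens to contain a URL (the URL below is made up); the actual bytes live on whatever host the URL points at:

```typescript
// A plain kind-1 text note, exactly as a relay stores it. The "image" is
// only a URL inside the content string; the file itself sits on an external
// host (the URL here is made up for illustration).
const note = {
  kind: 1,
  pubkey: "<author-pubkey>",
  created_at: 1700000000,
  tags: [],
  content: "check this out https://some-image-host.example/abc.jpg",
  // id and sig omitted for brevity
};
```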

This means Nostr as a whole is reasonably well equipped to tackle this problem without requiring centralized means of controlling and erasing content. It's critical that text can't be erased Nostr-wide; that's the whole point. Nostr relays do not talk to each other.
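
To sketch why that is: a client just broadcasts the same signed event to whichever relays it likes, and each relay independently decides whether to keep it, so there is no single place to delete it from. Something like this (relay URLs are placeholders, signing and error handling left out):

```typescript
// Minimal sketch: broadcast one signed event to several independent relays
// as NIP-01 ["EVENT", ...] messages over plain WebSockets. The relay URLs
// are placeholders; signing and error handling are left out.
const relays = [
  "wss://relay-one.example",
  "wss://relay-two.example",
  "wss://relay-three.example",
];

function broadcast(signedEvent: object) {
  for (const url of relays) {
    const ws = new WebSocket(url);
    ws.onopen = () => {
      ws.send(JSON.stringify(["EVENT", signedEvent]));
      ws.close(); // each relay keeps (or drops) its own copy independently
    };
  }
}
```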


Discussion

I wonder... if relays don't censor illegal content, won't they have legal culpability? Depending on their location, of course.

For instance, the original post I saw talked about reporting child p*rn. If a relay didn't block/censor that, would they have legal issues? Or if they allowed "hate speech", which is illegal in certain jurisdictions?

Another way to think about relays is that they are the back end to clients, a bit like a distributed database.
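
To make that concrete, a client just opens a WebSocket to a relay and asks for events matching a filter, NIP-01 style; something like this (the relay URL is a placeholder):

```typescript
// Sketch of querying a relay like a database: open a WebSocket, send a
// NIP-01 ["REQ", ...] with a filter, and the relay streams back matching
// ["EVENT", ...] messages. The relay URL is a placeholder.
const ws = new WebSocket("wss://some-relay.example");

ws.onopen = () => {
  // ask for the latest 20 text notes (kind 1)
  ws.send(JSON.stringify(["REQ", "my-sub", { kinds: [1], limit: 20 }]));
};

ws.onmessage = (msg) => {
  const [type, subId, event] = JSON.parse(msg.data);
  if (type === "EVENT" && subId === "my-sub") {
    console.log(event.content); // the client decides what, if anything, to show
  }
};
```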

Clients are on the hook for the content they display (with all due protections as well). Relays don't store anything other than text, so there is no child porn on relays, for example, though they could store URLs to other platforms where that content is hosted.

Regardless, it is those host platforms that are sitting on the illegal media. There could potentially be words on relays that count as "hate speech" or whatever, but if no client displays them, text sitting in a database isn't illegal in and of itself.
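
And "clients are on the hook" mostly amounts to a filtering step before anything is rendered. A minimal sketch, assuming the client has already fetched kind-1984 reports from accounts the user trusts:

```typescript
// Minimal sketch of client-side moderation: hide any note whose id shows up
// in kind-1984 reports from accounts the user trusts. How "trusted" is
// decided is an assumption here and entirely up to the client.
type Tag = string[];

interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: Tag[];
  content: string;
}

function hiddenIds(reports: NostrEvent[], trusted: Set<string>): Set<string> {
  const hidden = new Set<string>();
  for (const r of reports) {
    if (r.kind !== 1984 || !trusted.has(r.pubkey)) continue;
    for (const tag of r.tags) {
      if (tag[0] === "e" && tag[1]) hidden.add(tag[1]); // reported event id
    }
  }
  return hidden;
}

function visibleNotes(
  notes: NostrEvent[],
  reports: NostrEvent[],
  trusted: Set<string>
): NostrEvent[] {
  const hidden = hiddenIds(reports, trusted);
  return notes.filter((n) => !hidden.has(n.id));
}
```

Whose reports to trust is entirely the client's editorial call, not the relay's.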

Relays could be compelled to remove content or provide information just like anything else, but their exposure is pretty small. Compare this with Fediverse instances, which federate and store everything locally; fuck, what a nightmare.