This is great. Now we need people to go implement this stuff in a way that, ideally, any client and relay can use, and places like nostr.build too. One idea would be to have a way for relays to indicate that they're doing this kind of moderation. Metrics like that could incentivize big relays to get it done.
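For what it's worth, here's a minimal sketch of how a relay could advertise this. NIP-11 already lets a relay serve a JSON information document over HTTP when a client sends `Accept: application/nostr+json`; the `content_moderation` field below is purely hypothetical (no NIP defines it), but it shows where such a signal could live and how a client or crawler could gather the metrics mentioned above.

```python
import json
import urllib.request


def fetch_relay_info(relay_url: str) -> dict:
    """Fetch a relay's NIP-11 information document.

    Per NIP-11, a relay answers an HTTP GET on its websocket URL
    (ws:// swapped for http://) when the request carries the header
    Accept: application/nostr+json.
    """
    http_url = relay_url.replace("wss://", "https://").replace("ws://", "http://")
    req = urllib.request.Request(http_url, headers={"Accept": "application/nostr+json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


def advertises_moderation(info: dict) -> bool:
    """Check for a hypothetical 'content_moderation' flag.

    This field name is an assumption -- nothing standardizes it yet;
    the point is that NIP-11 already gives relays a place to state policy.
    """
    return bool(info.get("content_moderation"))


if __name__ == "__main__":
    info = fetch_relay_info("wss://relay.example.com")
    print(info.get("name"), "advertises moderation:", advertises_moderation(info))
```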
Research - Managing CSAM (Child Sexual Abuse Material) on decentralized social media protocols.
https://purl.stanford.edu/vb515nd6874
I thought this paper was worth a read. It goes into how child sexual abuse material is discovered and shared on social media. It mostly covers the fediverse; nostr is mentioned only to note that the authors are aware of it and that it has different issues than the fediverse. We will have to deal with this stuff, and relay operators and media hosts in particular are potentially liable. Nostr.build already has a content moderation team; others are going to need to do something or risk losing their domain name, as has happened to some Mastodon servers.
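On the media-host side, the standard first line of defense is hash matching uploads against known-material lists. Real deployments use perceptual hashing services (e.g. PhotoDNA through programs like NCMEC's) rather than exact hashes, so this is only a toy sketch of the upload-time gate, with a placeholder hash set:

```python
import hashlib

# Placeholder for a known-material hash list. Real integrations use
# perceptual hashes obtained through vetted programs, not exact
# SHA-256; this empty set is purely illustrative.
KNOWN_BAD_SHA256: set[str] = set()


def passes_hash_check(upload: bytes) -> bool:
    """Return False if the upload's hash is on the known-bad list."""
    return hashlib.sha256(upload).hexdigest() not in KNOWN_BAD_SHA256
```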