Replying to rabble

Research - Managing CSAM (Child Sexual Abuse Material) on decentralized social media protocols.

https://purl.stanford.edu/vb515nd6874

I thought this paper was worth a read. It goes into how child sexual abuse material is discovered and shared on social media. It mostly covers the fediverse; nostr is mentioned only to note that the authors are aware of it and that it has a different set of issues than the fediverse. We will have to deal with this stuff: relay operators and media hosts in particular are potentially liable. Nostr.build already has a content moderation team; others are going to need to do something or risk losing their domain name, as has happened to some mastodon servers.
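For media hosts, the usual first line of defense is hash-matching uploads against known-bad hash lists before anything gets a public URL. A minimal sketch of that flow in TypeScript; the file format and the `loadBlockList` helper are assumptions for illustration, not any particular hash-sharing program's actual API:

```ts
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

// Hypothetical format: one lowercase hex SHA-256 per line, from
// whatever hash-sharing program the host has access to.
async function loadBlockList(path: string): Promise<Set<string>> {
  const text = await readFile(path, "utf8");
  return new Set(text.split("\n").map((l) => l.trim()).filter(Boolean));
}

// Screen an upload before hosting it. Exact-hash matching only;
// real deployments layer perceptual hashing (PhotoDNA, PDQ) on top,
// since trivial re-encoding defeats an exact SHA-256 match.
function screenUpload(bytes: Buffer, blockList: Set<string>): boolean {
  const digest = createHash("sha256").update(bytes).digest("hex");
  return !blockList.has(digest); // false => reject and report
}
```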

I was just saying today that I predict the disenfranchised "alt right", who have been frozen out of social media for about five years now, will adopt Nostr fast and make "goggles" to filter their view.
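"Goggles" here would just be client-side filtering: relays carry everything, and each client decides what to render. A rough sketch of the idea; the `Goggle` shape and its mute-list fields are invented for illustration:

```ts
// Minimal Nostr event shape (NIP-01 fields only).
interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  content: string;
  tags: string[][];
  created_at: number;
}

// A "goggle": a purely client-side lens over the event stream.
interface Goggle {
  mutedPubkeys: Set<string>;
  mutedWords: string[];
}

function visible(ev: NostrEvent, g: Goggle): boolean {
  if (g.mutedPubkeys.has(ev.pubkey)) return false;
  const text = ev.content.toLowerCase();
  return !g.mutedWords.some((w) => text.includes(w.toLowerCase()));
}

// Different communities ship different goggles over the same relays.
const feed = (events: NostrEvent[], g: Goggle) =>
  events.filter((e) => visible(e, g));
```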

The great thing is that while you can do this to Nostr, it's simpler and easier not to. Thanks to hidden services, which will soon include Indra, there will still be unimpeded transit of messages that would otherwise get flagged as "nazi" or "alt right" or whatever.
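Hidden-service relays are reachable today with nothing exotic: a Tor SOCKS proxy plus an ordinary websocket. A sketch using the `ws` and `socks-proxy-agent` npm packages; the .onion address is a placeholder:

```ts
import WebSocket from "ws";
import { SocksProxyAgent } from "socks-proxy-agent";

// socks5h:// so name resolution happens inside Tor, never locally.
const agent = new SocksProxyAgent("socks5h://127.0.0.1:9050");

// Placeholder address; any relay published as a hidden service works.
const relay = new WebSocket("ws://exampleonionaddress.onion", { agent });

relay.on("open", () => {
  // Standard NIP-01 subscription; nothing about the protocol changes
  // just because the transport goes over Tor.
  relay.send(JSON.stringify(["REQ", "sub1", { kinds: [1], limit: 10 }]));
});

relay.on("message", (data) => console.log(data.toString()));
```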

They are gonna still control a lot of things via DNS, and if you are paying attention, that means that pretty soon, give it two years, most of Nostr will be walled silos with tiny conduits between them, alongside a small, underground network of relays that don't presume to curate the traffic.

I just don't accept that liability should be on relays. Period. That is never going to be ok, and it's why, although there will be a brief flowering, the ways to lock down most of Nostr are already ready to apply.
