Replying to TheGrinder

When nostr:npub1r0rs5q2gk0e3dk3nlc7gnu378ec6cnlenqp8a3cjhyzu6f8k5sgs4sq9ac said I come up with a lot of ideas, he forgot to tell you about the "relay alert": a mechanism to trigger a consensus vote across the network (relay operators) when unacceptable content spreads across the protocol.

Yes, content like child abuse material, or anything else posing a risk to the wellbeing of any nostr user, i.e. an imminent threat to people's lives.

We will need such a mechanism at some point, or we'll risk sacrificing the censorship resistance of the protocol to a rushed solution, such as a small group gaining moderation/ban powers, or developers and relay operators banning content case by case.

There is also the risk of state-staged attacks on the network if nostr catches on and onboards over 250 million people. It's easy to flood the network with inappropriate content only to then point the finger and shout "look, only pedophiles use nostr".

We have to be ready for these kinds of things, so we should start working on them now, before the time comes.

1. Algorithms to filter out content, which users can enable in-app to customise their nostr experience, including "family friendly feeds".

2. An alert mechanism. If certain content appears on the network, the app that detects it first would trigger the aforementioned "relay alert", prompting ALL other relays to participate in a consensus vote on banning the offending relays and related users from the network. At the same time, the relay operators may decide whether the relays' information should be forwarded to law enforcement. Again, one of the worst-case scenarios is child abuse material. There is no excuse to protect that kind of content or those who spread and consume it. (A rough sketch of what such an alert and vote could look like follows below.)
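For illustration only, here's a minimal TypeScript sketch of how such an alert and vote could be represented as signed nostr events. The kind numbers (30100/30101), tag names, and the "ban" content convention are all made up for this example and are not existing NIPs; the quorum threshold is something relay operators would have to agree on separately, and signing/verification is omitted.

```typescript
// Hypothetical sketch only: kinds, tags and the "ban" convention are invented
// for illustration, not part of any existing NIP.

interface NostrEvent {
  id: string;
  pubkey: string;     // signer (here: a relay operator's key)
  created_at: number; // unix timestamp
  kind: number;
  tags: string[][];
  content: string;
  sig: string;
}

// Hypothetical kind numbers for this example.
const KIND_RELAY_ALERT = 30100; // raised by whoever detects the content first
const KIND_RELAY_VOTE  = 30101; // one vote per relay operator, referencing the alert

// Build an (unsigned) alert that references the offending event and relay.
function buildRelayAlert(offendingEventId: string, offendingRelayUrl: string, reason: string) {
  return {
    kind: KIND_RELAY_ALERT,
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ["e", offendingEventId],  // event being reported
      ["r", offendingRelayUrl], // relay where it was found
      ["reason", reason],       // e.g. "csam", "imminent-threat"
    ],
    content: "",
  };
}

// Tally votes referencing an alert and decide whether the quorum to ban is reached.
function tallyVotes(votes: NostrEvent[], alertId: string, quorum: number): boolean {
  const voters = new Set<string>();
  for (const v of votes) {
    const refsAlert = v.tags.some(([name, value]) => name === "e" && value === alertId);
    const inFavor = v.content === "ban";
    if (v.kind === KIND_RELAY_VOTE && refsAlert && inFavor) {
      voters.add(v.pubkey); // count each relay operator key only once
    }
  }
  return voters.size >= quorum;
}
```

How the quorum is set, how operator keys are registered, and how a ban is actually enforced are exactly the open questions such a proposal would need to answer.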

This is well intentioned, but completely wrong. When you take responsibility for the content, then YOU ARE responsible for it. If you people think that will appease State actors with ill will against Nostr, you are absolutely dreaming.

There will always, invariably, inevitably be "bad" content that they will be able to use to go after people running relays -- if you accept that you are responsible for it.

The fight is to get court precedent overturning the ridiculous notion that a hosting service is responsible for the content someone else creates. Or at least, to establish that a relay, which supposedly just RELAYS, is not responsible for what is relayed, just as it would make no sense to go after a town mayor because a criminal used the roads he ordered built and is responsible for maintaining.

Going down this path is short-lived and will inevitably either kill the protocol, or allow bad actors (think Meta) to co-opt it by pushing for ever-increasing regulation until they wash out all independent relay operators and we're back to square one in the walled garden.

nostr:note1qp3xz9h5dpcnhjgmzsprvylfja60r7hcnmw00seet4lr4vryafuscqxgjj


Discussion

Thank you for taking the time to share your opinion in such detail. I'd love to talk some more. Would you like to jump on the stream some time as a guest?

Thank you very much for the invitation. You catch me traveling for work for the next two weeks, though. Maybe after I'm back home.

In the meantime, if you're interested:

nostr:note106v8c8fn8je624lmnffmcvhnk9h327v2v6vensag498ggd3l6w2sawf0m7