They go into a black hole because Nostr is too small today and we're living with these public relays anyone can write to. That situation can't last if Nostr gets bigger: there will be too much spam. Many relays will be forced to restrict themselves to users who meet certain criteria (payment, relationship, content quality, PoW, etc.), and once you have that it becomes in the interest of these relays to exclude people who aren't behaving well, so reporting them to the relays they use becomes a meaningful act.
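
To make that concrete, here is a minimal sketch (with hypothetical names) of the kind of write-policy gate such a relay could run. Only the PoW rule comes from a real spec, NIP-13, where difficulty is the number of leading zero bits of the event id; the payment and invite checks are illustrative:

```typescript
// Hypothetical write-policy gate for a closed relay.
// Only the NIP-13 PoW rule is from a real spec; everything else is illustrative.

interface NostrEvent {
  id: string      // 32-byte hex sha256 of the serialized event (NIP-01)
  pubkey: string  // 32-byte hex public key of the author
  kind: number
  tags: string[][]
  content: string
  sig: string
  created_at: number
}

// NIP-13: difficulty = number of leading zero bits of the event id.
function leadingZeroBits(hexId: string): number {
  let count = 0
  for (const char of hexId) {
    const nibble = parseInt(char, 16)
    if (nibble === 0) { count += 4; continue }
    count += Math.clz32(nibble) - 28 // clz32 counts over 32 bits; a nibble uses the last 4
    break
  }
  return count
}

// Hypothetical acceptance criteria, checked cheapest-first.
function acceptWrite(
  event: NostrEvent,
  paidPubkeys: Set<string>,     // users who paid the relay fee
  invitedPubkeys: Set<string>,  // users vouched for by existing members
  minPow: number,               // fallback PoW requirement for everyone else
): boolean {
  if (paidPubkeys.has(event.pubkey)) return true
  if (invitedPubkeys.has(event.pubkey)) return true
  return leadingZeroBits(event.id) >= minPow
}
```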

https://pyramid.fiatjaf.com/ has a reports page where, presumably, I or other users can check if some bad actor got accidentally invited and remove them, for example (not that it has ever happened).

Anyway, the crucial part is that what counts as bad behavior for me may not for you, so my relay may only accept notes full of love and kindness, but yours may accept notes full of rage and sarcasm, and both are OK -- which brings us to the next step: if a user only wants to see love and kindness, they should be able to configure their client to, in some situations, only see replies that come from my relay, for example.
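
In NIP-01 wire terms, "only see replies from my relay" just means the client sends its REQ for that thread to one trusted relay and ignores the others. A minimal sketch, assuming a placeholder relay URL and note id:

```typescript
// Minimal sketch: fetch replies to one note from a single, trusted relay only.
// Wire format per NIP-01; the relay URL and event id are placeholders.

const KIND_NOTE = 1
const trustedRelay = 'wss://relay.example.com' // e.g. a love-and-kindness-only relay
const rootNoteId = '<32-byte hex event id of the note being replied to>'

const ws = new WebSocket(trustedRelay)

ws.onopen = () => {
  // REQ with a filter for kind-1 events that tag the root note with an "e" tag.
  ws.send(JSON.stringify(['REQ', 'replies', { kinds: [KIND_NOTE], '#e': [rootNoteId] }]))
}

ws.onmessage = (msg) => {
  const [type, _subId, payload] = JSON.parse(msg.data)
  if (type === 'EVENT') console.log('reply from trusted relay:', payload.content)
  if (type === 'EOSE') ws.close() // stored events exhausted; stop if we only want history
}
```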

How all of this works in practice I cannot say for sure, but I see zero centralization pressure in any of these steps.


Discussion

So the ecosystem has to grow (adoption needs to go up) AND we need spam attacks to give us the incentive to innovate away from those big public relays.

I'm in violent agreement with all of that. But weren't we talking about cyberbullying? What you've described just sounds like filtering content at one's relays, not reporting or moderation.

Better UIs to surface relay filters seem like a strong advance in the direction of "be your own algorithm."

If you know your relay cares about cyberbullying, under some definition of that term, you can report such activity to the relay operators to have that bully kicked off the relay.
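
Under NIP-56 that report is an ordinary event of kind 1984, tagging the offender's pubkey and the offending note along with a report type. A sketch of the unsigned template, with placeholder ids:

```typescript
// Sketch of a NIP-56 report event template (kind 1984), before signing.
// Pubkey and event id values are placeholders.

const reportTemplate = {
  kind: 1984,
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    // The report type rides in the third position of the tag (NIP-56);
    // "other" covers things like harassment that lack a dedicated type.
    ['p', '<offender pubkey hex>', 'other'],
    ['e', '<offending event id hex>', 'other'],
  ],
  content: 'repeated targeted harassment in replies', // free-form description
}
// The client then signs this per NIP-01 and publishes it to the relays
// whose operators are meant to act on it.
```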

If you have a Democrat-only relay, you can report Republicans.

The domain of censorship is the relay, not the entirety of Nostr. The mechanisms of reporting are things like NIP-56, and the mechanisms of moderation are things like NIP-86.
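
For a rough picture of the moderation side: NIP-86 is a JSON-RPC-style management API served over the relay's HTTP endpoint, authenticated with a NIP-98 Authorization header. A sketch of an operator tool banning a reported pubkey (the NIP-98 token construction is elided, and the URL and pubkey are placeholders):

```typescript
// Sketch of a NIP-86 relay management call: ban a pubkey from the relay.
// NIP-98 auth token construction is elided; URL and pubkey are placeholders.

async function banPubkey(relayHttpUrl: string, offenderPubkey: string, nip98Token: string) {
  const res = await fetch(relayHttpUrl, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/nostr+json+rpc',
      'Authorization': `Nostr ${nip98Token}`, // base64-encoded NIP-98 HTTP auth event
    },
    body: JSON.stringify({
      method: 'banpubkey',
      params: [offenderPubkey, 'cyberbullying reports from members'],
    }),
  })
  return res.json() // NIP-86 responses look like { result: ..., error: ... }
}
```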

We can't even really see how censorship-resistant Nostr truly is until relay operators start removing stuff beyond the agreed-upon spam and other obviously bad things.

There's some funny situational irony to that. Creating unique experiences across many relays seems like the most logical way forward, to me, since that can help the user base grow, too. Anything else would just be war games.