But is there any authority that makes the decision, or is there some sort of algorithm?


Discussion

On the Nostr platform, much like in the real world, decisions on content removal are made by Nostr users. Moderators at various levels work in consensus with other members to reach a community-based decision. There are no central authorities: Nostr aims to be as decentralized as possible, so that power is distributed equitably and the governance process stays transparent. So when it comes to bots, the community flags problematic behaviour and sifts out the offending accounts, as in the sketch below.
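
As a rough illustration of how such flagging can work at the protocol level, here is a minimal sketch of a NIP-56 style report event (kind 1984), which is one way Nostr clients signal objectionable content to relays and other users. The `signEvent` and `publishToRelay` calls are hypothetical placeholders, not part of any specific library.

```typescript
// Minimal sketch of a NIP-56 report event (kind 1984), assuming your
// client provides signing and publishing helpers. No central authority
// acts on this event; relays and other clients each decide how to treat it.

interface NostrEvent {
  kind: number;
  created_at: number;
  tags: string[][];
  content: string;
}

// Flag another user's note as spam (e.g. a misbehaving bot).
const report: NostrEvent = {
  kind: 1984, // reporting event per NIP-56
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    ["e", "<event-id-being-reported>", "spam"],
    ["p", "<pubkey-of-the-author>", "spam"],
  ],
  content: "Automated bot spam",
};

// const signed = signEvent(report, mySecretKey);            // hypothetical helper
// publishToRelay("wss://relay.example.com", signed);        // hypothetical helper
```

Other users and relay operators can then weigh these reports however they choose, which is what keeps the decision community-driven rather than algorithmic or centralized.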