Replying to s3x_jay

Some of the people who've seen what #[0] & I have been proposing in "NIP-69" seem to think the objective is censorship. So today I sat down and wrote out the bigger "vision" of where I'd like to see content moderation go on Nostr. Feel free to give it a read:

https://s3x.social/nostr-content-moderation

Just realize it's a first draft and needs work. But the point I hope to get across is that I want to see something that's individual and "bottom up". To me, censorship is always top down, since at the core of censorship is some authority flexing their power and enforcing their idea of what's good and bad - overriding your idea of good and bad.

Instead I want to see a cacophony of voices, with individuals choosing which voices in that chaos they want to listen to when filtering their feeds. (Or they could choose to listen to none and see it all.)

But systems have to be put in place to make that a reality. It won't happen by accident.

And yes, the government will always force a certain level of censorship on us. But there are ways around that. For example, our relay can't have anything related to escorting on it thanks to FOSTA/SESTA (a horrible law), but people who need to post about escorting could use #[1]'s relay. And that's the whole point of Nostr - it's censorship-resistant, not censorship-proof. Nothing is censorship-proof…

I shared this on the fediverse, Bluesky, and Twitter to get a sense of what folks from other communities think. In particular I got feedback from one person who really knows the problems of content moderation, trust, and safety: Yoel Roth, who ran Twitter’s trust and safety team until Elon Musk took over. He said:

> “The taxonomy here is great. And broadly, this approach seems like the most viable option for the future of moderation (to me anyway). Feels like the missing bit is commercial: moderation has to get funded somewhere, and a B2C, paid service seems most likely to be user-focused. Make moderation the product, and get people used to paying for it.” - https://macaw.social/@yoyoel/110272952171641211

I think he’s right: we need to get the commercial model for funding moderation work right. It could be a crowdsourced thing, a company, a subscription service, etc. Lightning helps a ton here, because we can do easy, fast, cheap payments! Users can then choose which moderation service they want to use, or choose not to use any at all.
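To make the mechanics concrete, here’s a minimal sketch in TypeScript of how a client could apply a chosen moderation service: collect that service’s Kind 1984 reports (NIP-56) and hide the notes they reference. The event shape follows NIP-01, but the moderator pubkeys and function names are hypothetical, and a real client would stream reports from relays rather than take them as an array.

```typescript
// Minimal NIP-01 event shape (only the fields this sketch needs).
interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: string[][];
  content: string;
  created_at: number;
}

// Moderation services the user has opted into (hypothetical pubkeys).
const trustedModerators = new Set<string>([
  "<pubkey of a paid moderation service>",
  "<pubkey of an NGO the user trusts>",
]);

// Collect the ids of notes flagged by trusted moderators. Per NIP-56,
// a Kind 1984 report references the offending note in an "e" tag.
function flaggedIds(reports: NostrEvent[]): Set<string> {
  const flagged = new Set<string>();
  for (const report of reports) {
    if (report.kind !== 1984) continue;
    if (!trustedModerators.has(report.pubkey)) continue;
    for (const tag of report.tags) {
      if (tag[0] === "e" && tag[1]) flagged.add(tag[1]);
    }
  }
  return flagged;
}

// Filter a feed: drop notes flagged by the services the user chose.
function filterFeed(feed: NostrEvent[], reports: NostrEvent[]): NostrEvent[] {
  const flagged = flaggedIds(reports);
  return feed.filter((note) => !flagged.has(note.id));
}
```

The important design point is that the trust set lives entirely client-side - switching or dropping a moderation service is just editing a list of pubkeys, so nothing is imposed from the top down.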

A group could even come together and set up a moderation service that’s compliant with local laws. The most extreme example would be a company providing a moderation service in China that complies with Chinese social media law; a mobile app installed in China could be locked in to that moderation service.

Another could be a church group providing moderation services that match the cultural values of the group. That one doesn’t have the force of the state behind it, so it wouldn’t be locked to a region, but it would still be very valuable to its users.

Perhaps there could be a Nostr kids moderation service for users who are under 18?

Anyway, we need to find ways to fund and pay for these services. Especially since we can’t just take a cut of advertising revenue to cover the costs.

And since Nostr is open source and a permissionless network, what one app or user does isn’t imposed on others.

Discussion

I see a huge role for NGOs in moderating Nostr. I can see a lot of organizations wanting a role if Nostr really becomes a leading social media platform - the SPLC, ASACP, FSC, and ACLU, to name a few on the liberal side, but also their conservative counterparts. They could do fundraising to give them the money to have a voice.

Where I see corporate involvement is in “AI-ish” bots that do a “decent” job of detecting certain types of content. They won’t be perfect, but they’ll cover a lot of ground quickly.
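As a rough sketch of what that could look like - assuming TypeScript, a placeholder classify() standing in for whatever model a vendor actually runs, and the NIP-56 convention of carrying the report type in the “e”/“p” tags - a bot could turn detections into ordinary Kind 1984 reports:

```typescript
// Report types defined by NIP-56.
type ReportType =
  | "nudity" | "malware" | "profanity"
  | "illegal" | "spam" | "impersonation" | "other";

// Placeholder heuristic - a real service would call an ML model or API here.
function classify(content: string): ReportType | null {
  return /<some banned pattern>/i.test(content) ? "illegal" : null;
}

// Build an unsigned Kind 1984 report for a flagged note. Signing and
// publishing are left to whatever nostr library the bot uses.
function buildReport(note: { id: string; pubkey: string; content: string }) {
  const reportType = classify(note.content);
  if (reportType === null) return null;
  return {
    kind: 1984,
    created_at: Math.floor(Date.now() / 1000),
    tags: [
      ["e", note.id, reportType],     // the offending note
      ["p", note.pubkey, reportType], // its author
    ],
    content: "Flagged automatically; pending human review.",
  };
}
```

Since these reports are just events signed by the bot’s key, users and other moderation services can weigh or ignore them like any other voice in the cacophony.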

In related news… One of our challenges, which I discovered today, is that nostream (and presumably other relay implementations) won’t accept Kind 1984 events from non-paying users, even though non-paying users can read from the relay. That needs to be fixed ASAP - it’s a huge legal problem for relay operators. I would go as far as saying it should be added to NIP-56: “paid relay operators must accept Kind 1984 events from any user if they relate to content on their relay”.
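The fix amounts to a single carve-out in the relay’s write policy. Here’s a hedged sketch of the proposed rule (the isPaidUser lookup is hypothetical - nostream’s actual configuration and plugin interfaces differ):

```typescript
// Proposed acceptance rule: Kind 1984 reports are accepted from anyone,
// while all other kinds keep the existing pay-to-write policy.
function shouldAcceptEvent(
  event: { kind: number; pubkey: string },
  isPaidUser: (pubkey: string) => boolean
): boolean {
  // Always accept reports so anyone can flag problem content on the relay.
  if (event.kind === 1984) return true;
  return isPaidUser(event.pubkey);
}
```

A stricter version could also verify that the report’s “e” tag points at an event the relay actually stores, so the open door doesn’t become a spam channel.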

#[4]

Lmao

Lol