Replying to rabble

I shared this on the fediverse, Bluesky, and Twitter to get a sense of what folks from other communities think. In particular, I got feedback from someone who really knows the problem of content moderation and trust and safety: Yoel Roth, who ran Twitter's trust and safety team until Elon Musk took over. He said:

> “The taxonomy here is great. And broadly, this approach seems like the most viable option for the future of moderation (to me anyway). Feels like the missing bit is commercial: moderation has to get funded somewhere, and a B2C, paid service seems most likely to be user-focused. Make moderation the product, and get people used to paying for it.” - https://macaw.social/@yoyoel/110272952171641211

I think he's right: we need to get the commercial model for funding the moderation work right. It could be crowdsourced, a company, a subscription service, etc. Lightning helps a ton here, because it gives us easy, fast, cheap payments. Users can then choose which moderation service they want to use, or choose not to use any at all.
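As a rough sketch of what user-selectable moderation could look like on the client side (the function names and the moderator pubkey here are illustrative, not from any NIP): the client subscribes to the Kind 1984 report events published by the moderation service the user has chosen and paid for, and filters the feed against them.

```python
# Illustrative sketch: client-side filtering of a feed using Kind 1984
# reports published by a user-chosen moderation service. Event dicts
# follow the general Nostr event shape; the pubkey below is a placeholder.

MODERATOR_PUBKEY = "npub1moderator..."  # hypothetical: the paid service's key

def reported_event_ids(report_events, moderator_pubkey):
    """Collect the ids of events the chosen moderator has reported."""
    ids = set()
    for ev in report_events:
        if ev["kind"] != 1984 or ev["pubkey"] != moderator_pubkey:
            continue
        for tag in ev["tags"]:
            if tag and tag[0] == "e":  # "e" tags point at the reported event
                ids.add(tag[1])
    return ids

def filter_feed(feed, report_events, moderator_pubkey):
    """Drop feed events the chosen moderator has flagged."""
    flagged = reported_event_ids(report_events, moderator_pubkey)
    return [ev for ev in feed if ev["id"] not in flagged]
```

A user who picks no moderation service simply skips the filter entirely; nothing here is imposed network-wide.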

A group could even come together and set up a moderation service that complies with local laws. The most extreme example would be a company providing a moderation service in China that complies with Chinese social media laws: if you installed a mobile app in China, it would use that moderation service, locked in.

Another could be a church group providing moderation services that reflect the cultural values of the group. That one doesn't have a state behind it, so it wouldn't be locked to a region, but it would still be very valuable to its users.

Perhaps there could be a nostr kids moderation service for users who are under 18?

Anyway, we need to find ways to fund and pay for these services, especially since we can't just take a cut of advertising revenue to cover the costs.

And because it's an open-source, permissionless network, what one app or user does isn't imposed on others.

I see a huge role for NGOs in moderating Nostr, and I can see a lot of organizations wanting a role if Nostr really becomes a leading social media platform: the SPLC, ASACP, FSC, and ACLU, to name a few on the liberal side, along with their conservative counterparts. They could do fundraising to give themselves the money to have a voice.

Where I see corporate involvement is in “AI-ish” bots that do a “decent” job detecting certain types of content. Those won’t be perfect but they’ll cover a lot of ground quickly.

In related news: one of the challenges I discovered today is that nostream (and presumably other relays) won't accept Kind 1984 report events from non-paying users, even though non-paying users can read from the relay. That needs to be fixed ASAP; it's a huge legal problem for relay operators. I would go as far as saying it should be added to NIP-56: "paid relay operators must accept Kind 1984 events from any user if they relate to content on their relay."
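For reference, a NIP-56 report is just a Kind 1984 event whose `e` and `p` tags name the reported event and its author, with a report type (e.g. `spam`, `illegal`, `impersonation`) on the tag. A minimal sketch in Python, leaving the event unsigned (the `id` and `sig` fields are computed at signing time, and all keys and ids below are placeholders):

```python
import json
import time

def build_report(reporter_pubkey, reported_event_id, reported_author,
                 report_type, reason=""):
    """Build the unsigned body of a NIP-56 report event (kind 1984)."""
    return {
        "kind": 1984,
        "pubkey": reporter_pubkey,
        "created_at": int(time.time()),
        "tags": [
            ["e", reported_event_id, report_type],  # the event being reported
            ["p", reported_author],                 # the reported author
        ],
        "content": reason,  # optional free-text context for moderators
    }

report = build_report(
    "deadbeef" * 8,   # placeholder reporter pubkey (hex)
    "cafebabe" * 8,   # placeholder reported event id
    "feedface" * 8,   # placeholder reported author pubkey
    "spam",
    "Repeated unsolicited advertising",
)
print(json.dumps(report, indent=2))
```

Accepting and storing events this small costs a relay almost nothing, which is part of why gating them behind payment seems like the wrong trade-off.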
