I might add "safeguarding practices/approaches" to nostr.com, and try building a resource list for client developers
Discussion
What use case did this foundation have in mind?
It seems like a contradiction to me that human rights activists would want some 3rd-party authority to censor them.
Maybe I’ve misunderstood?
You have. On safeguarding for young people, for example, it's important that they can use nostr clients without being exposed to inappropriate content. We in fact have the tools and the permissionless development environment to build safer spaces than centralised services can offer.
This is super interesting to me.
One version I can envision is a build or app organized around a particular audience or use case, with situationally appropriate relays and algorithm options (see the sketch below).
Netiquette was a good start, but this will require a deft balance of intent and engineering.
Of course, there would always be the option for a fully open, unadulterated installation if people chose that.
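To make that concrete, here is a minimal sketch in TypeScript of what such an audience-specific profile might look like. Everything here is hypothetical (the ClientProfile shape, the visibleEvents helper, the relay URL are illustrations, not any existing client's API); the only protocol detail assumed is the NIP-36 content-warning tag.

```ts
// Hypothetical sketch: an audience-specific client profile.
// None of these names come from an existing Nostr client or library;
// they only illustrate the idea of bundling relays + filtering rules.

interface NostrEvent {
  kind: number      // event kind per the Nostr protocol
  content: string
  tags: string[][]  // e.g. [['content-warning', 'reason']] per NIP-36
  pubkey: string
}

interface ClientProfile {
  name: string
  relays: string[]          // curated, situationally appropriate relays
  allowedKinds: number[]    // event kinds the client will render
  mutedPubkeys: Set<string> // community-maintained mute list
  hideContentWarnings: boolean
}

// Example: a profile for younger users (relay URL is made up).
const youthProfile: ClientProfile = {
  name: 'youth-safe',
  relays: ['wss://moderated-relay.example.com'],
  allowedKinds: [0, 1], // profile metadata and text notes
  mutedPubkeys: new Set(),
  hideContentWarnings: true,
}

// Client-side filtering before rendering. A fully open installation
// would use a profile that simply passes everything through.
function visibleEvents(events: NostrEvent[], p: ClientProfile): NostrEvent[] {
  return events.filter(ev =>
    p.allowedKinds.includes(ev.kind) &&
    !p.mutedPubkeys.has(ev.pubkey) &&
    (!p.hideContentWarnings ||
      !ev.tags.some(t => t[0] === 'content-warning'))
  )
}
```

The point of a design like this is that the relay list and the filtering rules live in the profile rather than in the client code, so the same client could ship both an open build and a safeguarded build.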
I don't know the specifics there, but I doubt the issue is wanting "some 3rd party to censor them." Rather, they want to use the platform in a manner that does not result in them being harassed or spammed. Moderation is very rarely about "censorship" and more often about community building and community norms.