No, I'm only here to continue the conversation already started and be a resource for devs trying to figure out what to do with illegal content.

It seems like many of them don't know what sort of risks they're exposing users to by creating "reporting" features. Others, like nostr.build, are doing the right thing by reporting material to NCMEC.

Coordination between service providers is challenging, which is good for censorship resistance, bad for keeping users safe from legal liability.

I've adopted the term "Trust and Safety" as my username with tongue somewhat in cheek, sort of like our revered "Nostr CEO".


Discussion

Thank you, that seems reasonable and desirable! What is your background for this? Are you paid by any group, or is this your personal goal to improve the situation with CSAM?

Our shared goal is to (actually) prevent child abuse (CSAM, trafficking, etc.) without giving censorship, manipulation, or meddling powers to any of the ~200 governments around the world.

Especially given the history of leading individuals in various governments participating in child abuse, and the history of "well-intentioned" laws being used to exclude large groups of people from their freedoms (participating in the economy, social life, etc.).