Replying to rabble

Yeah, I think many countries would consider ML-generated CSAM to be the same thing as an actual picture or video taken of a sexualized child.

UK for example: https://www.bbc.com/news/uk-65932372

And in Australia it can be text, not just images or video: https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d

The risk for most relay operators is that people will use nostr for the discovery / connection between people, who then move on to other apps / servers to actually exchange the content. Apparently it's a kind of whack-a-mole with the different hashtags people search for this kind of content.

Thanks for addressing this. To say it's a massive concern is still an understatement. Especially since the Sound of Freedom movie came out and drew attention to the surrounding issue. (Haven't seen it, but the subject matter alone is stomach-churning.)

Nostr brings a *lot* of power and freedom, and, to quote Spider-Man, with great power comes great responsibility. We need white hats here.
