Replying to rabble

Yeah, I think many countries would consider ML-generated CSAM to be the same thing as an actual picture or video taken of a sexualized child.

UK for example: https://www.bbc.com/news/uk-65932372

And in Australia it can be text, not just images or video: https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d

The risk for most relay operators is that people will use nostr for the discovery / connection between people, who then go on to other apps / servers to actually exchange the content. Apparently it’s a kind of whack-a-mole with the different hashtags people use to search for this kind of content.

My response when this topic came up on XBiz.net…

Why? No children are harmed... It's the same reason anime with child-like characters is legal in most places.

I could understand a mandatory disclaimer that no children were involved in the creation of the content, but more than that is just censorship. Our (porn) industry exists because the law says you need to meet a higher bar to censor.

I feel zero need to understand other people's sexual preferences - just as I don't need other people to understand mine. All I would like is for people to be good to each other. If someone can be sexually gratified without harming someone else - great.

We don't need to know the provenance of every image to fight abuse.
