nostr:npub1wfvuzlcg08v2kz04pzed9jwpdv84rlxsd26eyy6q9m0ldn37swjq0hyfp8 nostr:npub1aw2wjq3pw536ql6635aq0a5sqxwlvzeguge6tkhk5mdzq4yez97sdp70dd nostr:npub108pv4cg5ag52nq082kd5leu9ffrn2gdg6g4xdwatn73y36uzplmq9uyev6 nostr:npub1ajw6axeack23437kedc8pkwghneenrkh9ljfxxgxumr6t6k4rtvqecaj8d It's ironic, isn't it? Services need money to run, of course, but eliminating child pornography is one of those goals we should all be able to set aside our differences for, and barring people from access to such a filtering mechanism unless they pay is dubious at best. I don't think forcing people to pay for a service that keeps their platform free of CSAM indicates the service is being provided in good faith.


Discussion

Yeah, a hash-checking service that any website operator could use at no cost ought to be run by something like the United Nations, but that would come dangerously close to helping make the world a better place, which I think is against their charter.