It also seems to me that the organizations selling CSAM-detection services to social media companies are profiting indirectly from that material in precisely the manner this law sought to prevent.
Discussion
nostr:npub1wfvuzlcg08v2kz04pzed9jwpdv84rlxsd26eyy6q9m0ldn37swjq0hyfp8 nostr:npub1aw2wjq3pw536ql6635aq0a5sqxwlvzeguge6tkhk5mdzq4yez97sdp70dd nostr:npub108pv4cg5ag52nq082kd5leu9ffrn2gdg6g4xdwatn73y36uzplmq9uyev6 nostr:npub1ajw6axeack23437kedc8pkwghneenrkh9ljfxxgxumr6t6k4rtvqecaj8d It's ironic, isn't it? Services need money to run, of course, but eliminating child pornography is one of those goals we should all be able to set aside our differences for, and barring people from a filtering mechanism unless they pay is dubious at best. I don't think that forcing platforms to pay for a service that keeps them free of CSAM indicates the service is being provided in good faith.