nostr:npub1a4tgkd6mnw6565svfaqvfsxj0auyt7gdagtterm2skxnwrtr3a9sp9q9yj and how are you supposed to test your hash algorithm?
nostr:npub1u27r9h3j9pvrplaffsmpn698e8xhmuqhdgcxldv67ammql9pumnqha3qfq I think the short answer is "hashing". See, e.g.:
https://www.missingkids.org/ourwork/ncmecdata
https://www.thorn.org/reporting-child-sexual-abuse-content-shared-hash/
https://www.thorn.org/blog/hashing-detect-child-sex-abuse-imagery/
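The approach described in the links above boils down to computing a hash of each image and comparing it against a list of known hashes. Below is a minimal sketch of that matching step, assuming a plain SHA-256 exact-match hash as a stand-in; production systems use proprietary perceptual hashes (e.g. PhotoDNA) that survive re-encoding and minor edits, and the hash lists are distributed by organizations like NCMEC and Thorn under agreement, neither of which is reproduced here.

```python
# Minimal sketch of hash-list matching, assuming a plain cryptographic hash.
# Real systems use perceptual hashes so slightly altered copies still match;
# KNOWN_HASHES here is a hypothetical stand-in for an externally provided list.
import hashlib

KNOWN_HASHES = {
    # hypothetical entries; real lists are distributed under agreement
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Exact-match hash of the raw bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if the image's hash appears in the known-hash list."""
    return image_hash(data) in KNOWN_HASHES
```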
Discussion
nostr:npub1u27r9h3j9pvrplaffsmpn698e8xhmuqhdgcxldv67ammql9pumnqha3qfq
I've never tried it, but typically there are dummy/safe values for testing. That said, I think a common approach for CSAM is simply to feed all images to an API provider that gives you back a verdict, in which case you test differently.
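As a hedged sketch of what "testing differently" might look like under that second approach: the provider client here is a hypothetical interface, and the test swaps it for a stub that returns canned verdicts for known-safe fixture IDs, so no real imagery and no real API call is involved.

```python
# Hypothetical provider interface plus a test double; names and signatures
# are assumptions for illustration, not any specific vendor's API.
from dataclasses import dataclass

@dataclass
class Verdict:
    flagged: bool
    reason: str = ""

class StubProvider:
    """Test double standing in for a real scanning API client."""
    def __init__(self, canned: dict[str, Verdict]):
        self.canned = canned

    def scan(self, image_id: str, data: bytes) -> Verdict:
        # Return the canned verdict for this fixture, default to "not flagged".
        return self.canned.get(image_id, Verdict(flagged=False))

def test_flagged_fixture_is_blocked():
    provider = StubProvider({"fixture-1": Verdict(flagged=True, reason="match")})
    verdict = provider.scan("fixture-1", b"\x89PNG dummy bytes")  # safe test data
    assert verdict.flagged
```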