Is there anything preventing this sort of thing happening on #Nostr?
Mastodon is a cesspool. https://www.theverge.com/2023/7/24/23806093/mastodon-csam-study-decentralized-network
Discussion
No, just go through my block list...
nope. we don't have moderation. anything goes. even the bad stuff.
Sort of, yes. Images are one of the Achilles' heels of censorship resistance on Nostr. Text notes can persist forever through the use of multiple relays, but image links are a single point of failure. The media host can delete the image, or, if the host itself is the bad actor, the host can be taken down. For the most part, so far we are only using a handful of image servers (nostr.build, void.cat, etc). NIP-94/95 somewhat "solve" that, but it falls into the category of:

Unless I'm wrong here and there is already a way to change where an image link is pointed?
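For what it's worth, NIP-94 does define a way to point at the same file on multiple hosts: a kind-1063 file-metadata event can carry a "url" tag plus "fallback" tags for mirrors, and an "x" tag with the file's SHA-256 so clients can verify any copy. A minimal sketch of building such an event (unsigned, with placeholder URLs; the helper name and content string are made up for illustration):

```python
import hashlib
import json
import time

# Hypothetical helper: build an unsigned NIP-94 (kind 1063) file-metadata
# event pointing at the same image on multiple hosts. Tag names follow the
# NIP-94 spec: "url" (primary), "fallback" (mirror), "x" (SHA-256 of the
# file bytes), "m" (MIME type). URLs and bytes here are placeholders.
def file_metadata_event(primary_url, fallback_urls, file_bytes, mime):
    return {
        "kind": 1063,
        "created_at": int(time.time()),
        "content": "image with mirrored hosts",
        "tags": [
            ["url", primary_url],
            *[["fallback", u] for u in fallback_urls],
            # The content hash lets a client verify any mirror serves
            # the same file, so no single host is a point of failure.
            ["x", hashlib.sha256(file_bytes).hexdigest()],
            ["m", mime],
        ],
    }

event = file_metadata_event(
    "https://nostr.build/example.png",   # placeholder primary host
    ["https://void.cat/example.png"],    # placeholder mirror
    b"fake image bytes",
    "image/png",
)
print(json.dumps(event, indent=2))
```

If a client honors the fallback tags, the image survives as long as any one mirror does, which is closer to the multi-relay redundancy text notes already get.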
I'm kind of curious if it might actually make it easier to find/delete CSAM images (and other images as well).
Scrape a relay for all image links and, through AI detection, manual human review, or a combination of both, flag images that need to be reported. Now who is going to do this? IDK.
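The first step of that pipeline could look something like this. A rough sketch: the sample note contents stand in for events pulled from a relay (a real scraper would subscribe via a NIP-01 REQ over a websocket), and the regex is illustrative, not exhaustive:

```python
import re

# Match common image URLs in note content. This pattern is a rough
# illustration; a real scraper would need a broader set of extensions
# and smarter URL parsing.
IMAGE_URL = re.compile(r"https?://\S+\.(?:png|jpe?g|gif|webp)", re.IGNORECASE)

def extract_image_links(notes):
    """Collect every image URL found in a batch of note contents."""
    links = []
    for content in notes:
        links.extend(IMAGE_URL.findall(content))
    return links

# Sample note contents standing in for events scraped from a relay.
sample_notes = [
    "check this out https://nostr.build/i/abc123.png",
    "no images here, just text",
    "two pics: https://void.cat/x.jpg and https://example.com/y.webp",
]

flagged = extract_image_links(sample_notes)
print(flagged)  # each link would then go to AI or human review
```

The hard part isn't the scraping, it's the review queue after it, which is exactly the "who is going to do this?" question.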