Sorry, I'm specifically asking about NSFW/CSAM?

Discussion

Initially the image is hashed, and that hash is used as the filename. The image is then scanned for CSAM only; if the AI or the Cloudflare scanner is unsure, we manually review it. If it is determined to be CSAM, that image, its hash, the uploader's IP address, and any related metadata are used to create a report sent to NCMEC.

The databases used to determine CSAM are based on a hash of the image. Hope that answers your question.
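For anyone building against this, here is a minimal sketch of the hash-as-filename idea described above. SHA-256 is an assumption; the thread does not say which hash function nostr.build actually uses, and the `hashBasedFilename` helper is purely illustrative.

```typescript
// Minimal sketch (Node.js): hash an image's bytes and use the digest as
// the stored filename, as described above. SHA-256 is an assumption here.
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

async function hashBasedFilename(path: string, ext: string): Promise<string> {
  const bytes = await readFile(path);
  const digest = createHash("sha256").update(bytes).digest("hex");
  return `${digest}.${ext}`; // the hash becomes the filename
}

hashBasedFilename("photo.jpg", "jpg").then(console.log);
```

A nice side effect of this scheme is deduplication: the same image always hashes to the same filename, and a known-bad image can be matched against hash databases without inspecting pixels.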

Great to know that users are protected from CSAM, but for NSFW images, is there any metadata a client can check BEFORE showing them to users? I'm asking these questions because I'm working on a client.

Given a URL of an image hosted on nostr.build, is there a way to figure out if it is NSFW?

I need to confirm with nostr:npub137c5pd8gmhhe0njtsgwjgunc5xjr2vmzvglkgqs5sjeh972gqqxqjak37w ; we were going to work with Damus on something, not sure if it got implemented.

FYI - We don’t allow porn (sex) with free uploads; naked adults are permitted.

Headers.

There is a moderation status header. I'm not sure what it is named, but just request any new image and it should show up. Note: it may take a few minutes to appear.

You can use a HEAD request to get only the headers without refetching the entire image.
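Here is a minimal sketch of that HEAD-request check. The header name `x-moderation-status` is hypothetical, since the actual name isn't confirmed in this thread; inspect the response headers of a freshly uploaded image to find the real one.

```typescript
// Minimal sketch (Node 18+ / browser): probe an image URL with a HEAD
// request and read its moderation header without downloading the body.
// "x-moderation-status" is a hypothetical header name, not confirmed here.
async function moderationStatus(url: string): Promise<string | null> {
  const res = await fetch(url, { method: "HEAD" }); // headers only, no body
  return res.headers.get("x-moderation-status");
}

moderationStatus("https://image.nostr.build/example.jpg").then((status) => {
  console.log(status ?? "header not present yet (may take a few minutes)");
});
```

Since the header can take a few minutes to appear after upload, a client should treat a missing header as "unknown" rather than "safe" and retry or blur until a status arrives.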

Thank you nostr:npub12262qa4uhw7u8gdwlgmntqtv7aye8vdcmvszkqwgs0zchel6mz7s6cgrkj .

Yes, just confirmed: there are moderation status headers on the images.