Alright #nostr, I think folks will be headed this way shortly, and more people = more problems.

This will be a problem here too, so now is a good time to talk about it. (It's a problem basically everywhere online, so Nostr won't be unique.)

This is a difficult topic, so let's start with the 101: Online Child Sexual Exploitation.

CSAM = Child Sexual Abuse Material (sometimes referred to as "child porn" or "cp"; it's better to say CSAM): any imagery depicting the sexual abuse of a child by an adult.

CSEM or CSE = Child Sexual Exploitation Material or Child Sexual Exploitation. This is a broad term: it can include grooming and online enticement, as well as any type of sexual exploitation of a minor online, including online child sex trafficking, where the sale of a child is happening online.

SG-CSEM = Self-Generated Child Sexual Exploitation Material. In other words, the child is groomed or coerced into taking imagery of themselves and sharing it. This is common in "sextortion."

Sextortion: It's like catfishing, but with the purpose of extorting imagery, money, or both. This can happen to anyone, but teenage boys are especially susceptible and vulnerable.

AI-generated CSE & CSAM: AI imagery is being used in sextortion cases as well. Offenders take a real picture of the child, composite it onto AI-generated CSAM or CSE, and blackmail the child into sending money or imagery.

If you see anything questionable on Nostr, please report it to the NCMEC CyberTipline. (I'm not shilling for an organization and I don't work for them; they are the national organization behind the hash database that PhotoDNA runs on.)

https://www.missingkids.org/gethelpnow/cybertipline

If the questionable content (CSAM, CSEM, or SG-CSEM) is identified by NCMEC as abuse material, it will be hashed and added to the PhotoDNA database. That hash is checked across all the major corporate tech platforms, so if the imagery pops up again it will be removed.

PhotoDNA

https://en.wikipedia.org/wiki/PhotoDNA
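To make the hash-matching idea concrete, here's a minimal sketch. Note the big simplification: PhotoDNA uses a proprietary *perceptual* hash that still matches after resizing or re-encoding, while the plain SHA-256 used below only matches byte-identical files. The hash list and function names here are made up for illustration; this is just the shape of the check a platform runs at upload time.

```python
import hashlib

# Hypothetical list of hashes of known, verified abuse material.
# (In reality this would be NCMEC's curated PhotoDNA hash list,
# and the hashes would be perceptual, not cryptographic.)
known_hashes = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def is_known_match(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears on the known-content list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# A platform would run this at upload time and block/report any match,
# without ever needing to store or view the original material.
if is_known_match(b"uploaded image bytes"):
    print("match: block upload and file a report")
```

The key design point is that platforms only ever exchange hashes, never the imagery itself, which is why reporting to the CyberTipline (so the content gets hashed in the first place) matters so much.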

There is a decent amount of technology out there to detect and remove this content, but I would caution anyone using it to be careful. I have concerns sometimes. That's all I can say about that right now.

In closing, I believe that many of these issues can be prevented through education, awareness, and prevention. It's not a tech platform's or protocol's responsibility to end all evil in the world. When we are educated about these issues, we can prevent them from happening to ourselves and our loved ones.

In the case of Nostr: if you love this house, get ready to protect it. If you see this type of content, PLEASE don't share it, because you would be spreading it. Let these people know that they are not welcome here, and please report the imagery to the app and the CyberTipline. 💜
