I don't know if I'll regret this post, but I just can't keep it to myself. I've been surfing global since Dec. 2022 and I've seen all kinds of stupid, weird, and distasteful stuff, but I've just seen some AI-generated kiddie porn and it has freaked me the fuck out. Not sure how the West is handling this type of stuff, but I think it's totally fuckin' not right, and seeing it on nostr is upsetting to me. Sure, I can block it, but I don't think there should be anything like that anywhere near this protocol.


Discussion

1) it’s likely the Fediverse

2) you’re right to be freaked out

3) Algos have to come. This isn’t a “see it and mute it” situation

You can report the user; it’s up to the relay operator to decide whether they’re allowed to post that sort of material to the relay. I’m guessing most wouldn’t want it posted to theirs.
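For reference, the report mechanism here is NIP-56: you publish a kind 1984 event that tags the offending note and its author, and relay operators can act on those reports however they choose. A minimal sketch in TypeScript; the ids are placeholders and the signing/publishing step is left to whatever client library you use:

```typescript
// Minimal NIP-56 report event (kind 1984). The hex ids below are
// placeholders; sign it with your usual Nostr client library.
const report = {
  kind: 1984,
  created_at: Math.floor(Date.now() / 1000),
  tags: [
    // Tag the offending author and note; "illegal" is one of the
    // report types NIP-56 defines.
    ["p", "<offender-pubkey-hex>", "illegal"],
    ["e", "<offending-note-id-hex>", "illegal"],
  ],
  content: "CSAM", // optional free-text reason
};
// After signing: ws.send(JSON.stringify(["EVENT", signedReport]));
```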

which note was it?

No one is going to link to it. What’s wrong with you?

Just asking which note it was, so I can judge for myself. What's wrong with you?

You need to see it?

Really?

Fed? Is that you?

Feds have a way of knowing what it was, even if it was deleted; no need to ask anyone. One's crass attitude towards respectful people asking respectfully shows their poor breeding.

“Respectfully” asking for a link to CP is anything but respectful.

Perv

A few options:

Filter public relays from global. Public relays will always have garbage like this just from trolls and feds.

If people want to be a part of the solution, I recommended this yesterday:

nostr:note1thc256k4rycz9kh79ufw2zhss3hqwkwl7us4qrv0wh6uy2ph6t9qm2f2fj

If clients had a way of alerting image hosts to the content they're hosting, those hosts might feel more pressure to actually fix the problem by screening uploads.

What if they just move to other image hosts that don't use such a system, or self-host the images?

Damus is just a browser; it is not designed to police the internet. Is there an expectation that Google Chrome detects and blocks this stuff? Maybe once edge AI gets good enough we can detect and hide it automatically.

It would be hard to stop client-side. Relay note filtering would probably be better than a report system, since images can always find new hosts.
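One way a relay could act on that: drop incoming events from pubkeys that have accumulated enough kind-1984 reports. This is a toy sketch, not any existing relay's behavior; the threshold and in-memory storage are invented for illustration:

```typescript
// Toy relay-side filter: reject events from pubkeys with too many
// NIP-56 reports against them. A real relay would persist this and
// weigh who is doing the reporting, not just count.
const REPORT_THRESHOLD = 3; // invented number for the sketch
const reportCounts = new Map<string, number>();
const blocked = new Set<string>();

function onIncomingEvent(event: {
  kind: number;
  pubkey: string;
  tags: string[][];
}): boolean {
  if (blocked.has(event.pubkey)) return false; // reject

  if (event.kind === 1984) {
    // Tally reports against each pubkey tagged in the report.
    for (const tag of event.tags) {
      if (tag[0] === "p") {
        const n = (reportCounts.get(tag[1]) ?? 0) + 1;
        reportCounts.set(tag[1], n);
        if (n >= REPORT_THRESHOLD) blocked.add(tag[1]);
      }
    }
  }
  return true; // accept and store
}
```

Counting raw reports is trivially gameable, of course; anything real would have to weight reports by reporter reputation or restrict them to trusted moderators.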

I’ve never tried searching for it, but I imagine it does filter and block it in some way.

Would have to, no?

No. Chrome does not “have to”

It obviously should

With true decentralization like #Bitcoin & #Nostr, you've gotta be willing to take the good with the bad.

That is the trade off people need to be willing to make.

If you can censor anything, you can censor everything.

Sorry you had to see that. I just choose not to surf global for that reason.

Relays could use AI to censor that sort of content though.

It's the job of cops to go after child abusers. If you see something bad, denounce it.

Asking to throttle everybody's ability to use a protocol without restriction because *someone* is using it to do illegal or morally reprehensible stuff is the same thing anti-Bitcoin people, or anti-gun people, do.

If your note said "I don't think there should be anything like that anywhere near this [public road system]" because you know that child traffickers use roads, it would make as much sense. What we do with child traffickers when we find them on the road is go after them, not after whoever built the road.

That's the downside of being censorship-resistant and open. It's not the protocol's job to police people; we have actual police for that, and if they actually did their job we wouldn't see any kiddie porn.

You can either have a censorship-resistant protocol or censorship. You can never limit it to just the stuff you want to censor. If you provide a way, it will inevitably spread and grow.

I agree that shit is disturbing (though I'm thoroughly desensitized, so not all that much to me), but you cannot remove it from Nostr and still have Nostr.

You as a user can mute or report it. Client devs can develop algos to automate cleansing of such content, and relay operators can do the same.

What you cannot do is get rid of it at a protocol level.

I don't know the solution for nostr, except for relays and image hosts to monitor and remove stuff. It sucks that this is a problem; it's absolutely evil.

It goes way deeper than nostr though. For us, the main thing is that it puts operators in a dicey legal position. This is bad enough, but manageable. But removing CSAM from nostr doesn't solve the root of the problem, which is that it exists — and governments protect and propagate it. I've heard good things about Sound of Freedom. Exposing this stuff is a step towards eradicating it.

If clients only show your follows' follows, it solves this without an algorithm.

Nobody follows that, and anyone who does burns their reputation and would be immediately unfollowed/excommunicated.
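For what that follows-of-follows idea would look like in a client: build a two-hop trust set from kind-3 contact lists and only render notes whose author is inside it. A sketch, assuming a hypothetical fetchContactList helper that stands in for querying your relay pool:

```typescript
// Web-of-trust feed filter: trust yourself, your follows, and their
// follows, derived from kind-3 contact lists. fetchContactList is a
// hypothetical stand-in for a relay query, not a real library call.
type NostrEvent = { kind: number; pubkey: string; tags: string[][] };

declare function fetchContactList(
  pubkey: string
): Promise<NostrEvent | null>;

// A kind-3 contact list stores follows as ["p", <pubkey>] tags.
function followsOf(list: NostrEvent | null): string[] {
  return (list?.tags ?? []).filter((t) => t[0] === "p").map((t) => t[1]);
}

async function buildTrustSet(me: string): Promise<Set<string>> {
  const trusted = new Set<string>([me]);
  const direct = followsOf(await fetchContactList(me));
  direct.forEach((pk) => trusted.add(pk));
  // Second hop: everyone my follows follow.
  for (const pk of direct) {
    followsOf(await fetchContactList(pk)).forEach((f) => trusted.add(f));
  }
  return trusted;
}

// Feed rendering then becomes:
//   notes.filter((n) => trusted.has(n.pubkey));
```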