Research - Managing CSAM (Child Sexual Abuse Material) on decentralized social media protocols.

https://purl.stanford.edu/vb515nd6874

I thought this paper was worth a read. It goes into how child sexual abuse material is discovered and shared on social media. Mostly it talks about the fediverse; nostr is mentioned only to say the authors are aware of it and that it has different issues than the fediverse. We will have to deal with this stuff, and relay operators and media hosts in particular are potentially liable. Nostr.build already has a content moderation team; others are going to need to do something or risk losing their domain name, as has happened to some mastodon servers.


Discussion

Another angle here is that we can probably gather quite a lot of evidence about anyone who publishes child abuse material on nostr. Making it clear that we have this capability could act as a deterrent.
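For context on why evidence gathering is plausible: under NIP-01, every nostr event carries the author's pubkey, an id that is the SHA-256 of a canonical serialization of the event, and a BIP-340 Schnorr signature over that id, so an archived event is self-authenticating. A minimal sketch of the integrity check in Python (full verification of the signature needs a secp256k1 library and is only noted in a comment):

```python
import hashlib
import json

def nostr_event_id(event: dict) -> str:
    """Compute the NIP-01 event id: SHA-256 over the canonical
    serialization [0, pubkey, created_at, kind, tags, content]."""
    payload = json.dumps(
        [0, event["pubkey"], event["created_at"],
         event["kind"], event["tags"], event["content"]],
        separators=(",", ":"), ensure_ascii=False,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def looks_authentic(event: dict) -> bool:
    """First evidentiary check: the stored id must match the content.
    A full proof also verifies event["sig"], a BIP-340 Schnorr
    signature over the id, against event["pubkey"], e.g. with a
    secp256k1 library."""
    return nostr_event_id(event) == event["id"]
```

Anyone who archived such an event can later prove which key published what, which is where the deterrent comes from.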

A good point. 👍

How?

In Nostr, I've only seen CP created by neural networks. As long as they don't use materials with real children, I don't care. On the other hand, in a number of countries, even a picture of a naked ninja turtle will be taken by the court as child pornography. And relay admins are really risking their freedom. That's a problem.

Yeah, I think many countries would consider ML-generated CSAM to be the same thing as an actual picture or video taken of a sexualized child.

UK for example: https://www.bbc.com/news/uk-65932372

And in Australia it can be text, not just images or video: https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d

The risk for most relay operators is that people will use nostr for the discovery and connection between people who then go into other apps or servers to actually exchange the content. Apparently it's a kind of whack-a-mole with the hashtags people use to search for this kind of content.
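To make the whack-a-mole concrete: a relay operator could run a crude hashtag screen and queue hits for human review. This is a sketch under obvious assumptions — the blocklist entries are placeholders, and a real list would need constant updating as terms rotate:

```python
import re

# Placeholder blocklist; the whack-a-mole problem is precisely that
# these terms rotate, so a real list needs continuous updates.
BLOCKED_TAGS = {"exampletag1", "exampletag2"}

HASHTAG_RE = re.compile(r"#(\w+)")

def should_flag(event: dict) -> bool:
    """Flag an event whose 't' tags or inline hashtags hit the
    blocklist. Hits go to human review; this is not auto-removal."""
    tags = {t[1].lower() for t in event.get("tags", [])
            if len(t) >= 2 and t[0] == "t"}
    tags |= {m.lower() for m in HASHTAG_RE.findall(event.get("content", ""))}
    return not tags.isdisjoint(BLOCKED_TAGS)
```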

Thanks for addressing this. To say it's a massive concern is still an understatement, especially since the Sound of Freedom movie came out and the surrounding issue got attention. (Haven't seen it, but the subject matter alone is stomach-churning.)

Nostr is a *lot* of power and freedom, and, to quote Spider-Man, with great power comes great responsibility. We need white hats here.

My response when this topic came up on XBiz.net…

Why? No children are harmed... It's the same reason anime with child-like characters is legal in most places.

I could understand a mandatory disclaimer that no children were involved in the creation of the content, but more than that is just censorship. Our (porn) industry exists because the law says you need to meet a higher bar to censor.

I feel zero need to understand other people's sexual preferences - just as I don't need other people to understand mine. All I would like is for people to be good to each other. If someone can be sexually gratified without harming someone else - great.

We don't need to know the provenance of every image to fight abuse.

This is great. Now we need people to go build this stuff in a way that ideally any client and relay can use, and places like Nostr.build too. One idea would be to have a way of indicating which relays are serving this stuff. Metrics like that could potentially incentivize big relays to get this done.
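As a sketch of what such a metric might look like, assuming a hypothetical crawler that samples events from each relay and runs them through a screen like should_flag() above:

```python
from collections import Counter

def relay_flag_rates(samples: list[tuple[str, bool]]) -> dict[str, float]:
    """samples: (relay_url, was_flagged) pairs gathered by the
    hypothetical crawler. Returns the share of sampled events each
    relay served that were flagged, a crude per-relay transparency
    score that could be published."""
    seen, flagged = Counter(), Counter()
    for relay_url, was_flagged in samples:
        seen[relay_url] += 1
        if was_flagged:
            flagged[relay_url] += 1
    return {url: flagged[url] / seen[url] for url in seen}
```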

Someone very dear to me is about to join a research team working on this subject. I really rather wish she wasn't tbh.

Good read, certainly opens my eyes to things I didn't know were going on.

That said, I have a hard time grasping how any node operator could implement the recommendations suggested.

Big Tech spends millions to police this stuff; many of the tools needed are financially out of reach. Even PhotoDNA, which is free, is only available to "qualified" organizations.

I fear that this gatekeeping of CSAM moderation tools will be used to justify the continued centralization of the internet.
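For illustration, the fallback that stays within a small operator's reach is exact hash matching against a shared list. The file name and format below are hypothetical, and the comments note why this is a poor substitute for perceptual hashing like PhotoDNA:

```python
import hashlib
from pathlib import Path

def load_hashlist(path: str) -> set[str]:
    """Load a hypothetical shared list of SHA-256 digests of known
    abuse files, one hex digest per line."""
    return set(Path(path).read_text().split())

def is_known_bad(filepath: str, known_bad: set[str]) -> bool:
    """Exact-hash matching: any re-encode or one-pixel change evades
    it, which is exactly the gap that gatekept perceptual-hash tools
    leave open for small operators."""
    digest = hashlib.sha256(Path(filepath).read_bytes()).hexdigest()
    return digest in known_bad
```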

"Think of the children" has long been used as an excuse to infringe on our rights. Big Tech won't hesitate to use it to squash #Nostr in it's infancy. Too many $$$ at stake.

They won't "squash" nostr; they will just try to take control of the bulk of the relay traffic, either directly or indirectly, by buying out or leaning on operators. There will be show trials.

None of this ever stopped 4chan or TheHive or any of the other web 2.0-style free-speech platforms from continuing to exist, and their ability to squash Nostr is far smaller, since the protocol is designed to keep working even on a swarm made of home users' connections and a small number of VPS systems in friendly jurisdictions.

A lot of nostriches are in for a little shock to learn that they are gonna have to work for their freedom, and take risks.

4chan is heavily moderated; they just have few rules. Home user nodes aren't the simplest solution either: if you run a home node, how are people connecting? If I run mine behind a VPN, then I need a static IP or domain name, either of which can be targeted. If I don't, then the feds are kicking my door in. If I run it behind Tor, then I'm screwing the network over with cat memes. 4chan is allowed to exist; like you said in the other post, the tools to hamper our activities already exist through targeting DNS. I'm certainly open to having this totally wrong, and I'm thinking about ways to subvert this specific issue every day. Maybe we need some kind of decentralized DNS.

I was just saying today that I predict the disenfranchised "alt right", who have been frozen out of social media for about 5 years now, will adopt Nostr fast and make "goggles" to filter their view.

The great thing is that while you can do this to Nostr, it's simpler and easier not to, and thanks to hidden services (which will soon include Indra), there will still be unimpeded transit of messages that would otherwise get flagged as "nazi" or "alt right" or whatever.

They are gonna still control a lot of things via DNS, and if you are paying attention, that means that pretty soon, give it 2 years, most of Nostr will be walled silos with tiny conduits between them, and a small, underground network of relays that don't presume to curate the traffic.

I just don't accept that liability should be on relays. Period. This is never going to be okay, and it's why, although there will be a brief flowering, the ways to lock down most of Nostr are already ready to apply.

Apparently this is a paper on managing child sexual abuse content on decentralized social media protocols.

It mostly discusses the Fediverse, and Nostr is only mentioned as having its own separate issues, but relay operators and people hosting media are going to need to take some kind of action.

nostr:nevent1qqs28y0thztgptz7udkqluvgvs4ctneerxxg0uj7uq5lxqwsfugnm3spzamhxue69uhkv6tvw3jhytnwdaehgu3wwa5kuegzypmvwx4w8fy378v7a3ruhgt7y2wd5sgn5zamde4wzamdwep798905qcyqqqqqqgynaav2