Can you censor anyone from posting CSAM on Nostr? No.

Can you censor anyone from making a CSAM inscription on Bitcoin? No.

Why is SimpleX even worried about what some no-life pedos do on a self-hosted private group chat?

It's a battle you can't win, and it's far too often used as an excuse/trojan horse to censor other things and to push anti-privacy laws and encryption backdoors. Look at the recent mess in the UK.


Discussion

so you'd prefer they didn't censor CSAM?

the fact that "what about the children/terrorism" is an excuse used to rationalize LE overreach

doesn't mean that CSAM isn't a problem and that people should do nothing.

You think that will stop pedos? You can't win that battle, and if you start implementing ways to censor things, down the line they will be used (abused) to censor other things. It's a slippery slope.

ok.

so you think they shouldn't censor CSAM.

which is a legit opinion people can have.

I personally think their methodology here

(having a bot join reported groups to look for CSAM)

is a reasonable compromise.

after all, these are relays that they're providing.

but then

i self-host my own SimpleX relays. so they aren't censoring shit.

if you use *someone else's infrastructure*

don't be surprised if they put limitations on your usage.

Them using state-produced CSAM is part of the game. They always try to poison the well.

Do I want to see this? Absolutely not. Do I value free speech higher than coming across some violent imagery? Absolutely.

It's a tool to exert power, like in politics, where they're all in some extortion ring involving child molesting.

Create dirt on someone

Let media make them a public figure

Exert control

People need to grow up and accept that there will always be humans who abuse others, and that oblivious politicians are, in 99.9% of cases, down to a lack of awareness of the role of the abuser.

Humanity's shortcomings can never be a reason to give up essential human rights that are necessary to develop individually and collectively.

I don't think it changes the actual actionable part of the discussion.

whether we suspect it's state-produced or not,

infrastructure operators still must decide whether to do something or not.

I agree. The model is not sufficiently decentralised if we need to rely on infrastructure providers. It could easily be more decentralised, and it has the capabilities, but the current implementation doesn't encourage widespread use. That's imo SimpleX's biggest mistake, and it will cost it market share if not fixed.

In the end it's me, the individual, who wants to control what I get to see and what I don't, with the option of changing my decision on a case-by-case basis.

it's pretty similar to nostr tho

a bunch of default relays that most people use to exchange messages...

or LN even

New users only get to see the SimpleX servers, and most newcomers don't even know they could run their own servers, let alone choose other servers like yours or mine.

To answer your question, ask yourself how you would feel if you released an app and then for a few years all you hear is that only pedos are using your app and you're responsible, wtf are you gonna do about it? The law holds you accountable because you're the operator, so if you don't remove the child pornography you can be prosecuted. But CSAM is totally different, because CSAM is just broad jargon for anything that looks like a kid, so hentai etc., even though it's legal and acceptable, gets brushed under the same blanket.

Also, the EU/UK is going through a cluster of law changes right now, so I'm sure he's constantly checking with his legal team about what he can do while keeping anonymity. But let's be upfront about something: YOU CAN HOST EVERYTHING YOURSELF!

If you want to upload hentai or anything you're afraid might be considered CSAM and get you blocked from the group directory, host your own SMP server, XFTP server, and directory bot. Then you have the freedom to do whatever, and you're bound by nobody's terms of service or privacy policy except your own.
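
For anyone curious what that actually involves, here's a rough sketch of standing up your own servers. The `smp-server` and `xftp-server` binaries are the official SimpleX server executables, but I'm writing the flags from memory (the hostname, storage path, and quota below are placeholders), so check the simplex-chat server docs before running anything:

```sh
# Rough sketch only: exact flags and default paths may differ between
# versions; verify against the official simplex-chat server docs.

# 1. SMP relay (message routing)
smp-server init     # generates keys and config (default: /etc/opt/simplex)
smp-server start    # prints an address like smp://<fingerprint>@smp.example.com

# 2. XFTP server (file transfers); init takes a storage path and quota
xftp-server init -p /var/opt/simplex-xftp/files -q '10gb'
xftp-server start   # prints an address like xftp://<fingerprint>@xftp.example.com

# 3. In the SimpleX Chat app: Settings -> Network & servers ->
#    add your smp:// and xftp:// addresses and disable the preset servers.

# 4. The group directory bot is a separate service (it lives in the
#    simplex-chat repo); point it at your own SMP server and you control
#    the directory too.
```

Once your clients are pointed at those addresses, none of your traffic touches the preset servers, which is the whole point of the argument above.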