so you'd prefer that they didn't censor CSAM?
Discussion
Can you censor anyone from posting CSAM on Nostr? No.
Can you censor anyone from making a CSAM inscription on Bitcoin? No.
Why is SimpleX even worried about what some no-life pedos do on a self-hosted private group chat?
It's a battle you can't win, and it's far too often used as an excuse/trojan horse to censor other things and to push for anti-privacy measures and encryption backdoors. Look at the recent mess in the UK.
so you'd prefer they didn't censor CSAM?
the fact that "what about the children/terrorism" is an excuse used to rationalize LE overreach
doesn't mean that CSAM isn't a problem and that people should do nothing.
You think that will stop pedos? You can't win that battle, and if you start implementing ways to censor things, down the line they will be used (abused) to censor other things. It's a slippery slope.
ok.
so you think they shouldn't censor CSAM.
which is a legit opinion people can have.
I'd personally think their methodology here
(having a bot join reported groups to look for CSAM)
is a reasonable compromise.
after all, it's their relays that they're providing.
but then
i self-host my own SimpleX relays. so they aren't censoring shit.
if you use *someone else's infrastructure*
don't be surprised if they put limitations on your usage.
facts matter
Them using state-produced CSAM is part of the game. They always try to poison the well.
Do I want to see this? Absolutely not. Do I value free speech higher than coming across some violent imagery? Absolutely.
It's a tool to exert power, like in politics, where they are all in some extortion ring regarding child molesting.
Create dirt on someone
Let media make them a public figure
Exert control
People need to grow up and accept that there will always be humans who abuse others, and unaware politicians are currently, in 99.9% of cases, in the role of the abuser due to that lack of awareness.
Humanity's shortcomings can never be a reason to give up essential human rights that are necessary for us to develop individually and collectively.
I don't think it changes the actual actionable part of the discussion.
whether we suspect it's state-produced or not,
infrastructure operators still must decide whether to do something or not.
I agree. The model is not sufficiently decentralised if we need to rely on infrastructure providers. It could easily be more decentralised, and it has the capabilities, but the current implementation doesn't encourage widespread use. That's imo SimpleX's biggest mistake, and it will cost it market share if not fixed.
In the end it's me the individual that wants to control what I get to see and what not with the option of changing my decision on a case by case basis.
To answer your question, ask yourself how you would feel if you released an app and then for a few years all you hear is "only pedos are using your app & you're responsible, wtf you gunna do about it?" The law holds you accountable because you're the operator, so if you don't remove the child pornography you can be prosecuted. But CSAM is totally different, because "CSAM" is just broad jargon for anything that looks like a kid, so hentai & etc., even though it's legal & acceptable, gets brushed under the same blanket.
Also the EU/UK is going through a cluster of law changes right now, so I am sure he's constantly checking with his legal team on what he can do while keeping anonymity. But let's be upfront about something: YOU CAN HOST EVERYTHING YOURSELF!
If you want to upload hentai or stuff you're afraid might be considered CSAM & get you blocked from the group directory, host your own SMP server, XFTP server, & directory bot. Then you have the freedom to do whatever, & you're bound by nobody's terms of service or privacy policy except your own.
You do know that most of the CSAM propagated in freedom tech comes from state agents looking for an excuse to break encryption, don't you?
nostr:note1l9wsl4u7f544n49zn9p4yejfm5zcw06ygmuydds2ta88x9vtsqpqk87uft
People who commit those kinds of crimes are part of the system/govs, you know that; they use the crimes they commit as an excuse to limit freedom and exert control over normal people.
Do you really think that measure from SimpleX helps stop actual CSAM?
Do you think the people who produce CSAM use SimpleX's servers to share it with their clients?
Most horrendous crimes are committed by elite people and people from governments.
Big drug operations, sex trafficking rings, child abuse, terrorism, you also know that. (Pizzagate, Epstein, Escobar, Bin Laden..)
CSAM also exists regardless of that.
and regardless of the source, infrastructure operators have to make a decision.
there's nothing particularly controversial about SimpleX's decision here.
Of course it exists regardless of that, too.
For sure they need to make a decision.
The thing that upsets me is their false narrative about the whys, and the pro-censorship language in their statements; that is a red flag to me.
("Freedom of speech without SOME restrictions is not possible.. study history", "censorship resistance and privacy are contradictory")
We all understand that it isn't to fight the propagation of CSAM, but to stay compliant with the law.
If the law tomorrow wants encryption broken, like the UK law against Apple, what do you think SimpleX will do?